
Originally published by Infosecurity Magazine

Author: Rebecca James

Compiled by JD Cloud Developer Community

Digital transformation is currently in full swing across the IT field: more and more enterprises are embracing it, and modern technologies such as machine learning and artificial intelligence are steadily gaining ground within organizations.

As the technologies that make up an enterprise's complex IT infrastructure mature, deploying cloud-native environments and running containers within them has long been part of the enterprise technology roadmap.

Fortunately for business owners, Kubernetes and container deployment not only go hand in hand with machine learning but can also be folded into a cloud-native model, bringing many benefits to the business, from enforcing effective business policies to fostering a stronger security posture.

When we talk about machine learning, what comes to mind? Its use cases are diverse: from straightforward fraud and cybercrime detection, to tailor-made customer experiences, to complex operations such as supply chain optimization. All of them testify to the substantial returns machine learning can bring to a business.

Gartner's forecasts further underline these advantages, predicting that by 2021, 70% of enterprises will rely on some form of artificial intelligence.

The application of artificial intelligence in business

For businesses to take full advantage of artificial intelligence and machine learning and apply them to emerging practices such as DevOps and DevSecOps, they must have a solid IT infrastructure.

A robust IT environment gives data scientists a place to experiment with various datasets, computational models, and algorithms without affecting other operations or burdening IT staff.

To implement machine learning effectively, enterprises need a way to deploy code repeatably in both on-premises and cloud environments, and to establish connections to all required data sources.

For modern businesses, time is of the essence, so they urgently need an IT environment that supports rapid code development.

Containers speed up the deployment of enterprise applications by wrapping code and its runtime dependencies into a single "package", a property that makes containers an ideal fit for enterprises, and therefore an ideal partner for machine learning and artificial intelligence.

To sum up, an AI project in a container-based environment passes through three promising phases: exploration, model training, and deployment. What does each phase involve? The three phases are explained below.

01 Exploration <br>When building AI models, data scientists routinely try different datasets and various ML algorithms to determine which combination yields the best predictive efficiency and accuracy.

Typically, data scientists rely on a large number of libraries and frameworks to create ML models for different problems across industries. They also need to be able to run tests and iterate quickly as they look for new revenue streams and work toward the enterprise's business goals.

While AI technology evolves rapidly, data already suggest that companies that let their data scientists and engineers develop with containerization hold an advantage over their competitors.

Canadian web hosting provider HostPapa outperformed other leading web hosting providers thanks to its early adoption of Kubernetes, according to a report by Ottawa DevOps engineer Gary Stevens.

Incorporating containers in the exploration phase of an AI or ML project lets data teams freely package libraries for their specific domain, deploy algorithms accordingly, and identify the right data sources for the team's needs.

With the successful implementation of container-based programs such as Kubernetes, data scientists have access to isolated environments. This allows them to customize the exploration process without having to manage multiple libraries and frameworks in a shared environment.

02 Model Training <br>After designing a model, data scientists need to train the AI program on large volumes of data across platforms to maximize the model's accuracy while minimizing manual effort.

Because training AI models is highly compute-intensive, containers prove very useful for scaling workloads and communicating quickly with other nodes. Typically, a scheduler (or a member of the IT team) determines which nodes are best suited for each job.

Moreover, pairing containers with modern data management platforms greatly simplifies data management in AI models. Data scientists also gain the ability to run AI or ML projects on many different types of hardware, such as GPUs, and to always target the hardware platform best suited to the task.

03 Deployment <br>The trickiest part of an AI project is production deployment: a machine learning application often combines multiple ML models, each serving a different purpose.

By incorporating containers into ML applications, IT teams can deploy each model as a separate microservice. What is a microservice? It is a self-contained, lightweight program that developers can reuse in other applications.
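The one-model-per-microservice idea can be sketched as a thin router dispatching requests by path, the way an API gateway or a Kubernetes Service routes traffic to separate model containers. Both model functions and the route paths below are hypothetical stand-ins for trained models behind real HTTP endpoints.

```python
def fraud_model(payload: dict) -> dict:
    # Hypothetical rule standing in for a trained fraud classifier.
    return {"fraud": payload.get("amount", 0) > 1000}

def recommender_model(payload: dict) -> dict:
    # Hypothetical stand-in for a recommendation model.
    return {"items": sorted(payload.get("history", []))[:3]}

# One route per model microservice. Shipping a new model means adding an
# entry here (in Kubernetes: a new Deployment + Service), not redeploying
# the whole application.
ROUTES = {
    "/predict/fraud": fraud_model,
    "/predict/recommend": recommender_model,
}

def handle(path: str, payload: dict) -> dict:
    """Dispatch a request to the microservice registered for its path."""
    service = ROUTES.get(path)
    if service is None:
        return {"error": "unknown model endpoint"}
    return service(payload)

print(handle("/predict/fraud", {"amount": 2500}))            # {'fraud': True}
print(handle("/predict/recommend", {"history": ["b", "a", "c", "d"]}))
```

Because each model lives behind its own route, one model can be retrained, scaled, or rolled back independently of the others.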

Not only do containers provide a portable, isolated, and consistent environment for rapidly deploying ML and AI models, they also have the potential to change today's IT landscape by enabling businesses to achieve their goals faster and better.

Original link: https://www.infosecurity-magazine.com/opinions/kubernetes-containers-machine/


京东云开发者

京东云开发者 (Developer of JD Technology) is a platform under JD Cloud for developers in AI, cloud computing, IoT, and related fields to share and exchange technical knowledge.