
2022: AI/ML Workloads in Containers: 6 Key Facts

Before IT leaders and their teams begin to dig into the nitty-gritty technical aspects of containerizing AI/ML workloads, some principles are worth thinking about up front. Here are six facts to consider.

Source: https://www.slideshare.net/WeCodeInc/2022-aiml-workloads-in-containers-6-key-facts

WeCode


Presentation Transcript


  1. 2022: AI/ML Workloads in Containers: 6 Key Facts

  2. Introduction: Before IT leaders and their teams begin to dig into the nitty-gritty technical aspects of containerizing AI/ML workloads, some principles are worth thinking about up front. Here are six essentials to consider.

  3. Table of Contents
  ● AI/ML workloads represent workflows
  ● The benefits are similar to other containerized workloads
  ● Teams need to be aligned
  ● The "pay attention" points don't really change
  ● Containers won't fix all underlying issues
  ● Be smart about build vs. buy

  4. 1. AI/ML Workloads Represent Workflows

  5. AI/ML Workloads Represent Workflows
  "Data gets gathered, cleaned, and processed," Haff says. Then, the work continues: "Now it's time to train a model, tuning parameters based on a set of training data. After model training, the next step of the workflow is [deploying to] production. Finally, data scientists need to monitor the performance of models in production, tracking prediction and performance metrics."
  "Traditionally, this workflow might have involved two or three handoffs to different individuals using different environments," Haff says. "However, a container platform-based workflow enables the sort of self-service that increasingly allows data scientists to take responsibility for both developing models and integrating into applications."
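The gather-clean-train-monitor workflow Haff describes can be sketched in plain Python. Everything below is an illustrative assumption, not code from the article: the function names are invented, and the "model" is a toy mean predictor standing in for real training.

```python
# Hypothetical sketch of the AI/ML workflow stages described above:
# gather -> clean -> train -> predict ("production") -> monitor.
from statistics import mean

def gather() -> list:
    # In practice: pull raw records from a feature store or object storage.
    return [1.0, 2.0, None, 4.0, 100.0]

def clean(raw: list) -> list:
    # Drop missing values and obvious outliers.
    return [x for x in raw if x is not None and x < 50]

def train(data: list) -> float:
    # Toy "model": predict the mean of the training data.
    return mean(data)

def predict(model: float, _features: float) -> float:
    # A real model would use the features; the toy model ignores them.
    return model

def monitor(model: float, data: list) -> float:
    # Track a simple performance metric (mean absolute error).
    return mean(abs(predict(model, x) - x) for x in data)

if __name__ == "__main__":
    data = clean(gather())
    model = train(data)
    print(monitor(model, data))
```

In a container platform-based workflow, each of these stages (or the whole pipeline) would run as its own container image, which is what enables the handoff-free self-service Haff describes.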

  6. 2. The Benefits Are Similar to Other Containerized Workloads

  7. The Benefits Are Similar to Other Containerized Workloads
  Nauman Mustafa, head of AI & ML at Autify, sees three overarching benefits of containerization in the context of AI/ML workflows:
  ● Modularity: It makes important components of the workflow – such as model training and deployment – more modular. This is similar to how containerization can enable more modular architectures, namely microservices, in the broader world of software development.
  ● Speed: Containerization "accelerates the development/deployment and release cycle," Mustafa says. (We'll get back to speed in a moment.)
  ● People management: Containerization also makes it "[easier] to manage teams by reducing cross-team dependencies," Mustafa says. As in other IT arenas, containerization can help cut down on the "hand off and forget" mindset as work moves from one functional group to another.

  8. 3. Teams Need to Be Aligned

  9. Teams Need to Be Aligned
  "Make sure everyone involved in building and operating machine learning workloads in a containerized environment is on the same page," says Frank from ISG. "Operations engineers may be familiar with running Kubernetes, but may not understand the specific needs of data science workloads. At the same time, data scientists are familiar with the process of building and deploying machine learning models, but may require additional help when moving them to containers or operating them going forward."
  "In a world where repeatability of results is critical, organizations can use containers to democratize access to AI/ML technology and allow data scientists to share and replicate experiments with ease, all while being compliant with the latest IT and InfoSec standards," says Sherard Griffin, director of global software engineering at Red Hat.
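One concrete way to get the repeatability Griffin mentions is to pin the experiment's entire environment inside the container image. A minimal Dockerfile sketch; the base-image tag, library versions, and `train.py` script are all illustrative assumptions, not taken from the article:

```dockerfile
# Illustrative only: pin the base image and exact dependency versions so
# every data scientist who rebuilds this image replicates the same
# environment for the experiment.
FROM python:3.11-slim

WORKDIR /app

# Pinned versions make results reproducible across teams and clusters.
RUN pip install --no-cache-dir scikit-learn==1.4.2 pandas==2.2.2

COPY train.py .
CMD ["python", "train.py"]
```

Sharing this image (or just this Dockerfile) is what lets another data scientist replicate the experiment without reconstructing the environment by hand.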

  10. 4. The "Pay Attention" Points Don't Really Change

  11. The "Pay Attention" Points Don't Really Change
  Here are three examples of operational requirements that you'll need to pay attention to, just like with other containerized applications:
  ● Resource allocation: Mustafa notes that proper resource allocation remains critical to optimizing cost and performance over time. Provision too much and you're wasting resources (and money); too little and you're setting yourself up for performance problems.
  ● Observability: Just because you can't see a problem does not render it out of existence. "Ensure that you have the necessary observability software in place to understand how your multi-container applications behave," Frank says.
  ● Security: "From a security point of view, launching AI/ML solutions is no different from launching other solutions in containers," says Alexandra Murzina, ML engineer at Positive Technologies. That means tactics such as applying the principle of least privilege (both to people and the containers themselves), using only trusted, verified container images, runtime vulnerability scanning, and other security layers should remain top of mind.
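The resource-allocation and least-privilege points above map directly onto a pod specification. A hypothetical Kubernetes manifest sketch, with all names, image references, and values as assumptions chosen for illustration:

```yaml
# Illustrative Pod spec: explicit resource requests/limits plus a
# least-privilege security context, as discussed above.
apiVersion: v1
kind: Pod
metadata:
  name: ml-inference
spec:
  containers:
    - name: model-server
      image: registry.example.com/model-server:1.0  # a trusted, verified image
      resources:
        requests:        # what the scheduler reserves for the container
          cpu: "500m"
          memory: 1Gi
        limits:          # hard ceiling: over-provisioning wastes money,
          cpu: "2"       # under-provisioning causes performance problems
          memory: 2Gi
      securityContext:   # principle of least privilege for the container
        runAsNonRoot: true
        allowPrivilegeEscalation: false
        readOnlyRootFilesystem: true
```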

  12. 5. Containers Won't Fix All Underlying Issues

  13. Containers Won't Fix All Underlying Issues
  Just as automation won't improve a flawed process (it just helps that flawed process run faster and more frequently), containerization is not going to address fundamental problems with your AI/ML workloads. If you're baking bias into your ML models, for example, running them in containers will do nothing to address that potentially serious issue.
  "Containers are very beneficial for running AI/ML workloads," says Raghu Kishore Vempati, director of technology at Capgemini Engineering. "[But] containerizing AI/ML workloads alone doesn't make the model more efficient. It only provides a way to accelerate the productivity associated with training the models and inferring on them."

  14. 6. Be Smart About Build vs. Buy

  15. Be Smart About Build vs. Buy
  As with most technical choices, there's a "should we or shouldn't we?" decision in terms of containerizing AI/ML workloads. Also like most important technical choices, nothing comes free. "There is a cost associated with containerizing machine learning workflows, which may not be justified for tiny teams, but for large teams, benefits outweigh the cost," Mustafa from Autify says.
  IT leaders and their teams should do it with clear goals or reasons in mind – "just because we can" shouldn't be the only reason on your list. "Don't overcomplicate an already complex situation," Frank says. "Make sure that containerizing ML workloads will provide business value beyond the intellectual exercise."
  Source: enterprisersproject

  16. Next-Gen Tech Services
  ● Mobile Application Development
  ● Cloud Computing Services
  ● Quality Assurance
  ● Digital Marketing
  Visit: www.wecode-inc.com
  Email: sales@wecode-inc.com
