Where MLOps Ends, and Edge Container Lifecycle Management Begins
MLOps helps teams design and develop their Machine Learning (ML) models and applications systematically, giving their AI initiatives a sustainable foundation. AI models are typically developed and trained in a centralized environment. Once ready for production, however, they often run closer to the telemetry data sources, at the edge.
A gap frequently arises at this point.
Most leading MLOps tools cover the essential functions for developing and packaging trained models and applications, but they rarely support the complete lifecycle management of the resulting distributed container applications.
In this session, we delve into the intersection of MLOps and edge container management and operations. We'll discuss where one ends and the other begins. We'll also explore how you, as an ML developer, can efficiently and securely align the lifecycles of the central and distributed components of your edge AI model and application containers.
We'll outline the path toward container application and model management and operations for the future.
Amy Simonson
Marketing Manager at Avassa, a Swedish edge computing company
Stockholm, Sweden