DevOps for AI: running LLMs in production with Kubernetes and KubeFlow

This session explores the essentials of deploying and managing large language models (LLMs) in production environments using Kubernetes and KubeFlow. As AI and LLMs move from experimentation to business-critical applications, it offers best practices, architectural design insights, and hands-on demonstrations to streamline AI workflows, ensure scalability, and maintain reliability. Aimed at developers and DevOps professionals, the talk will strengthen your AI deployment strategies and operational efficiency in real-world business scenarios.

Aarno Aukia

Partner at VSHN - The DevOps Company

Zürich, Switzerland
