Session

Evolving from MLOps to LLMOps - Architectures and Best Practices

This session explores how AI operations are evolving from traditional MLOps to the new world of LLMOps. As large language models transform how we build AI systems, we'll break down what changes and what stays the same when operationalizing these powerful models.

You will learn practical architectures for managing the complete lifecycle, from data preparation and model training to deployment and monitoring. We'll compare standard MLOps workflows with the new requirements of LLMOps, including prompt management, output validation, and cost optimization for large-scale models.

Using real-world examples with Databricks and MLflow, we will show how to implement these approaches effectively. Whether you're working with traditional machine learning models or cutting-edge LLMs, you'll leave with actionable strategies to streamline your AI operations and deployment pipelines.
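As a rough illustration of the kind of MLflow-based tracking the session points to, here is a minimal sketch of logging a prompt, its response, and cost signals as an experiment run; the experiment name, parameters, and metric values are illustrative assumptions, not code from the session.

```python
import mlflow

# Illustrative sketch only: logging a prompt/response pair plus simple
# cost and generation settings with MLflow, in the spirit of the LLMOps
# practices described above. All names and values are placeholders.
mlflow.set_experiment("llmops-demo")

with mlflow.start_run(run_name="prompt-eval"):
    prompt = "Summarise the quarterly sales report in three bullet points."
    response = "..."  # response returned by whichever model endpoint you call

    # Track the model choice and generation settings as parameters
    mlflow.log_param("model", "example-llm")
    mlflow.log_param("temperature", 0.2)

    # Store the full prompt/response pair as an artifact for later review
    mlflow.log_dict({"prompt": prompt, "response": response}, "interaction.json")

    # Record token usage and estimated cost alongside the run
    mlflow.log_metric("prompt_tokens", 24)
    mlflow.log_metric("completion_tokens", 120)
    mlflow.log_metric("estimated_cost_usd", 0.0004)
```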

Rajkumar Sakthivel

✨Expert in AI-Powered Ops & App Development | Global Public Speaker | Tesco Technology

London, United Kingdom

