Getting started with GenAI on Kubernetes

In this workshop we will build our first internal environment to run, orchestrate, and integrate LLMs, together with the components required for RAG and performance improvements.

Layer by layer we will set up LiteLLM, OpenWebUI, and Langfuse. Since it is not enough to only consume managed LLMs, we will then extend the environment with vLLM and KServe to run our own gpt-oss.
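To give a feel for the self-hosted layer, a KServe `InferenceService` can point at a Hugging Face model and serve it through a vLLM-backed runtime. The manifest below is a hedged sketch only: the runtime name, model name, and resource values are illustrative assumptions and will differ with your KServe version and model.

```yaml
# Sketch of a KServe InferenceService using the Hugging Face serving
# runtime (which can use vLLM under the hood for supported models).
# All values here are illustrative assumptions, not a verified setup.
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: gpt-oss          # hypothetical service name
spec:
  predictor:
    model:
      modelFormat:
        name: huggingface
      resources:
        limits:
          nvidia.com/gpu: "1"   # assumes one GPU is available
```

Once applied, KServe exposes the model behind an HTTP endpoint that the rest of the platform (e.g. a LiteLLM proxy) can route traffic to.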

The platform not only lets you experiment with LLMs and gives your teams a unified API to work against, it also helps you meet compliance and governance requirements.
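The unified API mentioned above is what the LiteLLM proxy provides: every configured model, managed or self-hosted, sits behind one OpenAI-compatible endpoint. A minimal config sketch, assuming an OpenAI key in the environment and a cluster-internal vLLM service (all model names and URLs below are placeholder assumptions):

```yaml
# LiteLLM proxy config: one OpenAI-compatible endpoint in front of
# several backends. Names, keys, and URLs are placeholder assumptions.
model_list:
  - model_name: gpt-4o                 # the name teams request
    litellm_params:
      model: openai/gpt-4o             # upstream provider/model
      api_key: os.environ/OPENAI_API_KEY
  - model_name: gpt-oss                # self-hosted model served by vLLM
    litellm_params:
      model: hosted_vllm/gpt-oss
      api_base: http://gpt-oss.default.svc.cluster.local/v1
```

Teams then call the proxy with any OpenAI-compatible client, switching between managed and self-hosted models by changing only the model name.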

Max Körbächer

Founder & Technology Advisor @ Liquid Reply

Munich, Germany
