LLM-Driven Applications: Intro to LLM Agents in Kubernetes
Over the past year, there has been significant work and discussion around running Large Language Models (LLMs) on top of Kubernetes. But what happens once your LLM is running in your cluster? How can you build an application around it in which the LLM acts as an agent, calling out to services deployed in your cluster to solve complex tasks that would otherwise cause it to hallucinate? And is there a way to do this without changing the services and deployments already in your cluster?
In this talk, I will explore what an LLM agent is and how to build one that calls out to your services. I will demonstrate this with an LLM agent running in a Kubernetes cluster that automatically detects new tools and agents deployed to the cluster and coordinates them to complete complex tasks in a conversation with the user.
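The abstract does not say how tool detection works, but one common pattern is to mark tool services with a label or annotation and have the agent watch the cluster for them. The sketch below is a minimal, hypothetical illustration of that idea: the `llm-agent/tool` annotation and the service shapes are assumptions, not from the talk, and in a real cluster the service objects would come from the Kubernetes API rather than an in-memory list.

```python
# Hypothetical sketch of annotation-based tool discovery. The
# `llm-agent/tool` annotation is an assumption for illustration; a real
# agent would list Service objects via the Kubernetes API and could
# watch for new ones as they are deployed.

def discover_tools(services):
    """Return the names of services annotated as agent-callable tools."""
    tools = []
    for svc in services:
        annotations = svc.get("metadata", {}).get("annotations", {})
        if annotations.get("llm-agent/tool") == "true":
            tools.append(svc["metadata"]["name"])
    return tools

# Example service list shaped like Kubernetes Service objects.
services = [
    {"metadata": {"name": "weather-api",
                  "annotations": {"llm-agent/tool": "true"}}},
    {"metadata": {"name": "postgres", "annotations": {}}},
]

print(discover_tools(services))  # prints ['weather-api']
```

Because discovery relies only on metadata, existing services need no code changes to become tools; they only need the annotation added, which matches the talk's goal of leaving deployed workloads untouched.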
Calum Murray
CNCF Ambassador and Software Engineer at Red Hat
Toronto, Canada