
Writing a Multi-LLM and Multi-RAG Powered Chatbot using AWS Gen AI

In this session, we will examine the AWS GenAI LLM Chatbot, a solution that lets you easily deploy and run a wide variety of Foundation Models (FMs) on Amazon Web Services (AWS).

It is compatible with numerous LLMs and vector datastores for Retrieval-Augmented Generation (RAG), including Amazon Bedrock (Amazon Titan, Anthropic's Claude, AI21 Jurassic, and Cohere), Amazon SageMaker, Amazon Aurora with pgvector, Amazon OpenSearch, and Amazon Kendra.
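As a rough illustration of the Bedrock side of such a chatbot, the sketch below builds a request body for an Anthropic Claude model hosted on Amazon Bedrock. The model ID, region, and request schema shown here are assumptions based on Bedrock's messages API, not details taken from the chatbot project itself; the actual `invoke_model` call is shown in comments since it requires AWS credentials and Bedrock model access.

```python
import json

# Hypothetical model ID; any Bedrock-enabled Claude model ID could be used.
MODEL_ID = "anthropic.claude-3-sonnet-20240229-v1:0"

def build_claude_request(prompt: str, max_tokens: int = 512) -> str:
    """Build the JSON request body for Anthropic Claude on Amazon Bedrock."""
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }
    return json.dumps(body)

# To actually invoke the model, a backend would use boto3's bedrock-runtime
# client, roughly like this (requires AWS credentials and model access):
#
#   import boto3
#   client = boto3.client("bedrock-runtime", region_name="us-east-1")
#   response = client.invoke_model(
#       modelId=MODEL_ID, body=build_claude_request("Hello!")
#   )
#   reply = json.loads(response["body"].read())["content"][0]["text"]
```

In a RAG setup, the user prompt would first be augmented with documents retrieved from one of the vector stores listed above before being sent to the model.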

Victor Rojo

Principal Global Tech Lead - Conversational AI Partners Competency at AWS

Seattle, Washington, United States


