Session
Crafting RAG LLM Applications: Leveraging Azure AI Studio and Prompt Flow with LLMOps
In this session, we’ll explore how to harness Retrieval-Augmented Generation (RAG) for large language models. We’ll delve into the technical aspects of developing and deploying RAG systems with Prompt Flow, a streamlined tool for building LLM-based AI applications in Azure AI Studio.
Mert Yeter
Cloud Solutions Architect, 2x Microsoft MVP (AI & Azure)
Istanbul, Turkey