Session
Monitoring Production-Grade Agentic RAG Pipelines
It's time to move beyond naive/vanilla RAG systems, where clean sample data could simply be plugged in and queried with an LLM. Upgrading from PoC to production means tuning several parameters, with performance being a key factor in achieving better results. Search and retrieval systems need proper data preprocessing before documents are ingested into vector databases. In this session we will walk through the building blocks of an advanced RAG pipeline that can be deployed and scaled in real time.
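To make the preprocessing step concrete, here is a minimal sketch of cleaning and chunking documents before they are embedded and written to a vector database. The `embed` callable and the `vector_store` client with an `upsert` method are illustrative placeholders, not the API of any specific library.

```python
# Minimal pre-ingestion sketch: clean and chunk documents, then embed and
# upsert them into a vector database. `embed` and `vector_store` are
# hypothetical stand-ins for whichever embedding model and database you use.
from dataclasses import dataclass


@dataclass
class Chunk:
    doc_id: str
    text: str


def clean(text: str) -> str:
    # Normalise whitespace; real pipelines also de-duplicate, strip markup,
    # and normalise unicode here.
    return " ".join(text.split())


def chunk(doc_id: str, text: str, size: int = 512, overlap: int = 64) -> list[Chunk]:
    # Fixed-size sliding-window chunking with overlap, so answers that
    # straddle a chunk boundary remain retrievable.
    words = clean(text).split()
    step = size - overlap
    return [
        Chunk(doc_id, " ".join(words[i:i + size]))
        for i in range(0, max(len(words) - overlap, 1), step)
    ]


def ingest(docs: dict[str, str], embed, vector_store) -> None:
    # embed: Callable[[str], list[float]]
    # vector_store: any client exposing an upsert(id, vector, metadata) call.
    for doc_id, text in docs.items():
        for i, ch in enumerate(chunk(doc_id, text)):
            vector_store.upsert(
                id=f"{doc_id}-{i}",
                vector=embed(ch.text),
                metadata={"doc_id": ch.doc_id, "text": ch.text},
            )
```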
Implementing robust, performant RAG systems is the industry's next big goal. Handling multiple operations while keeping latency low can be challenging, and AI agents have proven handy for automating routing tasks. Observability tools are the next step towards scalability, allowing each step of such LLM workflows to be debugged. Step-level traces help with application session handling and enable a deep dive into the inference flow and its outcomes.
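As a rough illustration of step-level observability, the sketch below collects in-memory trace spans per session. A real deployment would export these spans to a dedicated tool (e.g. Langfuse, Arize Phoenix, LangSmith); the `route` and `retrieve` functions here are hypothetical pipeline steps.

```python
# Minimal sketch of step-level tracing for an agentic RAG workflow:
# each decorated step records its name, latency, and status under a
# session id, so the flow can be inspected after the fact.
import functools
import time
import uuid
from typing import Any, Callable


class SessionTrace:
    def __init__(self) -> None:
        self.session_id = str(uuid.uuid4())
        self.spans: list[dict[str, Any]] = []

    def step(self, name: str) -> Callable:
        # Decorator that records latency and error status for one step.
        def decorator(fn: Callable) -> Callable:
            @functools.wraps(fn)
            def wrapper(*args, **kwargs):
                start = time.perf_counter()
                try:
                    result = fn(*args, **kwargs)
                    status = "ok"
                    return result
                except Exception:
                    status = "error"
                    raise
                finally:
                    self.spans.append({
                        "session": self.session_id,
                        "step": name,
                        "latency_s": round(time.perf_counter() - start, 4),
                        "status": status,
                    })
            return wrapper
        return decorator


trace = SessionTrace()


@trace.step("route")
def route(query: str) -> str:
    # Hypothetical router: send numeric questions to a SQL tool, the rest to RAG.
    return "sql" if any(c.isdigit() for c in query) else "rag"


@trace.step("retrieve")
def retrieve(query: str) -> list[str]:
    # Placeholder for a top-k lookup against the vector store.
    return ["...top-k chunks from the vector store..."]


if __name__ == "__main__":
    query = "Summarise the latest policy changes"
    if route(query) == "rag":
        retrieve(query)
    print(trace.spans)  # one span per step: name, latency, status
```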
This approach helps augment LLMs with large knowledge bases, letting you manage and control your data to make informed decisions for business use cases across BFSI, legal, healthcare, and other domains.
Jayita Bhattacharyya
Hackathon Wizard | Official Code-breaker | Generative AI | Machine Learning | Software Engineer | Traveller
Bengaluru, India