Session

Goodbye, Latency: Building Real-Time AI with gRPC

We’ve all been there: you build a brilliant AI feature, but users complain it’s frustratingly slow.
This session is for anyone who has faced this problem. We're going to take on latency itself and show you how to win. I will take you behind the scenes as we build and transform an "AI Mentor" for a live, hands-on learning platform. In the live demo, we will measure the performance of a slow REST/JSON implementation and then replace it with a high-performance gRPC backbone. We'll compare the two approaches side-by-side and witness the transformation from a laggy prototype to a seamless, real-time experience.
Key Takeaways:
1. Use gRPC's bidirectional streaming to create a fluid, "always-on" conversation between your users and your AI (see the sketch after this list).
2. Leverage Protobuf to drastically shrink your data payloads and achieve near-instant communication.
3. Structure your own applications around streaming and compact payloads to solve the latency problem.
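
To make the first two takeaways concrete, here is a minimal sketch of a bidirectional-streaming gRPC server in Python. It assumes a hypothetical mentor.proto defining a Mentor service with a bidi-streaming Chat RPC and a ChatMessage Protobuf message; the mentor_pb2/mentor_pb2_grpc modules and the generate_reply helper are illustrative stand-ins, not the demo's actual code.

```python
# Sketch only: assumes a hypothetical mentor.proto compiled with grpcio-tools:
#
#   service Mentor {
#     rpc Chat(stream ChatMessage) returns (stream ChatMessage);
#   }
#   message ChatMessage { string text = 1; }
#
# mentor_pb2 / mentor_pb2_grpc are the (assumed) generated modules.
from concurrent import futures
import grpc

import mentor_pb2
import mentor_pb2_grpc


class MentorServicer(mentor_pb2_grpc.MentorServicer):
    def Chat(self, request_iterator, context):
        # Both sides share one long-lived HTTP/2 stream: each incoming
        # message is answered as soon as it arrives, instead of paying a
        # fresh connection/request round trip per conversational turn.
        for message in request_iterator:
            reply = generate_reply(message.text)  # placeholder for your AI call
            yield mentor_pb2.ChatMessage(text=reply)


def generate_reply(prompt: str) -> str:
    # Stand-in for the real model call; in practice you would stream
    # tokens back as they are generated.
    return f"Mentor: I heard '{prompt}'"


def serve() -> None:
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=10))
    mentor_pb2_grpc.add_MentorServicer_to_server(MentorServicer(), server)
    server.add_insecure_port("[::]:50051")
    server.start()
    server.wait_for_termination()


if __name__ == "__main__":
    serve()
```

On the client side, the generated stub's Chat method takes an iterator of ChatMessage requests and returns an iterator of responses, so user input and AI replies flow over the same open stream, while Protobuf's binary encoding keeps each message far smaller than the equivalent JSON.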

Abhinav Sharma

Site Reliability Engineer at KodeKloud | Microsoft MVP | GSOC @OpenSUSE | GitHub Campus Expert

Jaipur, India
