Session
Offline Chatbots with WebLLM
Modern web applications rely heavily on cloud-based AI, but this comes with trade-offs: latency, cost, and privacy concerns. What if powerful AI models could run entirely inside the browser, even without an internet connection?
In this talk, we’ll explore WebLLM, a cutting-edge approach to running Large Language Models directly in the browser using WebGPU. You’ll learn how offline-capable chatbots can be built using JavaScript: no servers, no API keys, and no data leaving the user’s device.
This session is designed for web developers curious about the future of privacy-first, offline-ready AI experiences on the web.
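To give a flavour of what "no servers, no API keys" looks like in code, here is a minimal sketch of an in-browser chat call. It assumes the @mlc-ai/web-llm package and its CreateMLCEngine API; the model ID shown is illustrative, and the session itself may use a different setup.

// Runs entirely in the browser; weights are fetched once and cached,
// and inference executes on the local GPU via WebGPU.
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function demo() {
  const engine = await CreateMLCEngine(
    "Llama-3.1-8B-Instruct-q4f32_1-MLC",            // illustrative model ID
    { initProgressCallback: (report) => console.log(report.text) }
  );

  // OpenAI-style chat completion, computed on-device.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Explain WebGPU in one sentence." }],
  });
  console.log(reply.choices[0].message.content);
}

demo();

Once the model is cached, subsequent sessions can answer prompts with no network connection at all, which is the core of the offline-chatbot idea discussed in the talk.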
Simar Preet Singh
Software Engineer, Redaptive Inc.
Mohali, India