
Cloud vs Local: Running Gemini Nano in Chrome

Run Large Language Models (LLMs) locally in your browser! In this session, we'll look at one of the most exciting announcements from Google I/O: you can now run LLMs directly in Chrome using pre-loaded models.

We'll weigh the pros and cons of running AI models locally versus in the cloud, covering how the choice affects performance, privacy, data security, and cost. You'll also learn how to set up Gemini Nano in Chrome and how it can be used for various applications.
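As a taste of what on-device prompting looks like, here is a minimal sketch using Chrome's experimental built-in Prompt API. The `LanguageModel` global and its `create`/`prompt` methods reflect recent Chrome builds with the built-in AI flags enabled; the API surface is experimental and has changed between releases, so treat the exact names as assumptions and feature-detect before use.

```javascript
// Minimal sketch of prompting Gemini Nano via Chrome's built-in Prompt API.
// Assumes an experimental Chrome build with the built-in AI flags enabled;
// the `LanguageModel` global is not available in other environments.
async function promptLocalModel(text) {
  if (typeof LanguageModel === "undefined") {
    // Not a Chrome build with built-in AI — signal that to the caller.
    return null;
  }
  const session = await LanguageModel.create(); // loads the on-device model
  const reply = await session.prompt(text);     // inference runs locally
  session.destroy();                            // free model resources
  return reply;
}

promptLocalModel("Summarize this page in one sentence.").then((reply) => {
  console.log(reply ?? "Built-in AI is not available in this environment.");
});
```

Because the model runs entirely on-device, no prompt data leaves the browser — the privacy/cost trade-off at the heart of this session.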

This talk will help you decide when to run AI models locally versus in the cloud. Join us to explore the future of AI in web browsers!

Aldan Creo

Technology Research Specialist @ Accenture Labs

Dublin, Ireland


