From Zero to Fully Local AI: Building a Working LLM App with Foundry Local
Large language models don't always need the cloud. In this session, we build a fully local LLM application from scratch using Microsoft Foundry Local, exposing an OpenAI-compatible API on the device and integrating it into a working app, with no data ever leaving the machine. Attendees will learn what on-device AI can realistically do today, the trade-offs involved, and when local, cloud, or hybrid architectures make sense.
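Because Foundry Local serves an OpenAI-compatible API, a local app can talk to it with plain HTTP. The sketch below is illustrative only: `BASE_URL`, the port, and the `MODEL` alias are assumptions, not values from this session; Foundry Local reports its actual endpoint when the service starts, so adjust both to match your setup.

```python
import json
import urllib.request

# Assumed values for illustration: check the endpoint and model alias
# that your Foundry Local instance actually reports at startup.
BASE_URL = "http://localhost:5273/v1"
MODEL = "phi-3.5-mini"  # placeholder model alias

def build_chat_request(prompt: str) -> tuple[str, dict]:
    """Build an OpenAI-style chat-completions request for the local server."""
    url = f"{BASE_URL}/chat/completions"
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, payload

def ask_local_model(prompt: str) -> str:
    """POST the request to the local endpoint; no data leaves the device."""
    url, payload = build_chat_request(prompt)
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask_local_model("Summarize on-device AI in one sentence."))
```

Because the wire format matches the OpenAI API, existing OpenAI client libraries can usually be pointed at the local base URL instead of hand-rolling HTTP as above.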
Patrick Wahlmueller
Turning AI, Automation & Microsoft Technologies into real-world impact
Linz, Austria