🤖 Offline AI Magic: Run LLMs Locally with LM Studio & .NET 9
Discover the full power of offline AI in this deep-dive session. From why offline LLMs matter for privacy, data sovereignty, and latency, to configuring tools like LM Studio, Docker, and .NET 9, JK walks you through everything you need to run local models like a pro.
Jernej Kavka
Microsoft AI MVP, SSW Solution Architect
Brisbane, Australia