Distilling GPT-5.1: Training an Open-Source Student with an Azure LLM Teacher

Large models are powerful but expensive to run, slow to iterate on, and hard to ship. Distillation lets you transfer intelligence from a hosted Azure GPT-5.1 model into a lightweight open-source student that you fully control. In this talk, I’ll show how to generate structured teaching data, design effective prompts and evaluation loops, and fine-tune a smaller model that preserves quality while cutting cost and latency. Walk away with a practical recipe for bringing enterprise-grade reasoning into models you can deploy anywhere.

Gary Short

I predict the future.

Dundee, United Kingdom
