Session

Creating your own LLM from open-source models

From a "simple" fine-tuning to your own Mixture of Experts model, all built on open-source models.
Nowadays, training an LLM from scratch is a huge effort even for very large companies. Building your own model on top of pre-trained ones is no longer just a fallback for companies with limited resources; it is often the necessary starting point.

- LoRA (sketch below)
- Quantization and QLoRA (sketch below)
- Injecting an embedding model into LoRA to manage multiple LoRA adapters (sketch below)
- Mixing models (sketch below)
- Creating your own MoE (Mixture of Experts) model from several models you have fine-tuned yourself (sketch below)
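
For the LoRA topic, here is a minimal sketch using the Hugging Face PEFT library; the base model, rank, and target modules are illustrative assumptions, not the speaker's actual settings. The base weights stay frozen and only the small low-rank matrices are trained, which is what makes the fine-tuning affordable.

```python
# Minimal LoRA sketch with Hugging Face PEFT.
# Base model and hyperparameters are illustrative assumptions.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")

lora_config = LoraConfig(
    r=16,                                 # rank of the low-rank update
    lora_alpha=32,                        # scaling applied to the update
    target_modules=["q_proj", "v_proj"],  # attention projections to adapt
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of all weights
```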
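For quantization and QLoRA, one common recipe (a sketch assuming bitsandbytes 4-bit support in transformers, again with an illustrative base model) loads the base model in 4-bit NormalFloat and trains LoRA adapters on top of the frozen quantized weights:

```python
# QLoRA sketch: 4-bit quantized base model + LoRA adapters on top.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",             # NormalFloat4 quantization
    bnb_4bit_use_double_quant=True,        # also quantize the quantization constants
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-v0.1",           # illustrative base model
    quantization_config=bnb_config,
    device_map="auto",
)
model = prepare_model_for_kbit_training(model)  # freeze base, cast norms for stability
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32, task_type="CAUSAL_LM"))
```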
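The multi-adapter bullet can be read as: keep one base model, attach several LoRA adapters, and use a sentence-embedding model to route each request to the most relevant adapter. A hedged sketch of that reading, with hypothetical adapter names, paths, and descriptions (PEFT and sentence-transformers APIs):

```python
# Sketch: one base model, several LoRA adapters, embedding-based routing.
# Adapter names, paths, and descriptions are hypothetical.
from transformers import AutoModelForCausalLM
from peft import PeftModel
from sentence_transformers import SentenceTransformer, util

base = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")
model = PeftModel.from_pretrained(base, "adapters/summarize", adapter_name="summarize")
model.load_adapter("adapters/translate", adapter_name="translate")

# Describe each adapter once; embed the descriptions for routing.
router = SentenceTransformer("all-MiniLM-L6-v2")
descriptions = {
    "summarize": "condense a long document into a short summary",
    "translate": "translate text between languages",
}
desc_emb = {k: router.encode(v, convert_to_tensor=True) for k, v in descriptions.items()}

def route(prompt: str) -> str:
    """Pick the adapter whose description is closest to the prompt."""
    q = router.encode(prompt, convert_to_tensor=True)
    return max(desc_emb, key=lambda k: util.cos_sim(q, desc_emb[k]).item())

model.set_adapter(route("Shorten this report to three sentences."))
```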
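"Mixing models" can mean several things; the simplest is weight-space merging of two fine-tunes of the same base model (a "model soup"). A minimal sketch, with hypothetical model paths and an illustrative interpolation weight:

```python
# Sketch: linear weight-space merge of two fine-tunes of the SAME base model.
from transformers import AutoModelForCausalLM

model_a = AutoModelForCausalLM.from_pretrained("my-org/finetune-a")  # hypothetical
model_b = AutoModelForCausalLM.from_pretrained("my-org/finetune-b")  # hypothetical

alpha = 0.5  # interpolation weight between the two fine-tunes
state_a, state_b = model_a.state_dict(), model_b.state_dict()
merged = {name: alpha * w + (1.0 - alpha) * state_b[name] for name, w in state_a.items()}

model_a.load_state_dict(merged)           # reuse one model object to hold the merge
model_a.save_pretrained("merged-model")
```

Tools such as mergekit automate richer merge methods (SLERP, TIES, DARE), but the underlying idea is the same interpolation of parameters.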
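Finally, the MoE construction: a learned gate routes each token to the top-k expert networks and mixes their outputs. The toy layer below (plain PyTorch, dimensions illustrative) shows only the mechanism; building an MoE from several of your own fine-tunes means plugging their feed-forward blocks in as the experts.

```python
# Toy Mixture-of-Experts layer: a gate picks the top-k experts per token.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, dim: int, num_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.gate = nn.Linear(dim, num_experts)  # router over experts
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, dim)
        scores = self.gate(x)                            # (batch, seq, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)   # top-k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., k] == e                  # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out

layer = MoELayer(dim=64)
y = layer(torch.randn(2, 10, 64))  # (batch=2, seq=10, dim=64)
```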

Sebastiano Galazzo

Artificial intelligence researcher and proud dad

Milan, Italy
