Session
Leveraging Local LLMs for the Resource-Constrained Data Practitioner
This one-day, hands-on-keyboards working session is for data practitioners who are curious about integrating Large Language Models (LLMs) into their workflows but have concerns about cost, security, and complexity.
On their own laptops, starting from scratch, practitioners will learn how to:
* Find, install, configure, and manage LLMs on their own devices
* Integrate those LLMs into a working environment
* And, utilize LLMs in a data workflow
They will also learn:
* How LLMs in general work and what they are (and are not) good for
* LLMOps criteria to consider when deciding whether to use open/closed, local/hosted LLMs
* How to make domain-specific data available to their LLMs
* And, “context engineering” techniques to elicit the best possible results from LLMs
Practitioners will leave the session with:
* A functional local LLM working environment set up on their laptops
* Experience using this environment in a simplified but realistic use case
* Sufficient understanding of how to customize this setup to suit their specific and evolving needs
* And, a basis to continue exploring and refining how they employ LLMs to achieve desired data-centric outcomes
[Note to conference organizers: This is an expanded, hands-on version of a ~40-minute presentation providing an overview of the same subjects.]
Ram Singh
Builder, leader, and consultant exploring how AI models can deliver cheap, effective, secure, and simple solutions.
Nashville, Tennessee, United States