Speaker

Ram Singh

Builder, leader, and consultant exploring how AI models can deliver cheap, effective, secure, and simple solutions.

Nashville, Tennessee, United States

Ram is a technologist and operator who root-cause solves existential enterprise-scale business and product problems while attempting to *not* create entirely new ones. When not leading or consulting cross-functional product development teams in delivering simple, trusted, fun, and useful solutions that delight customers and deliver significant ROI, he’s usually in the guts of yet another startup idea to do the same. Or jeeping and camping with (The RDML Grace) Hopper, a dog.

Childhood access to computers, blended with classic sci-fi like Asimov’s R. Daneel Olivaw, Simak’s Jenkins, and Adams’ Marvin stories, sparked an early study of AI and robotics that has continued throughout Ram’s professional career. Like many, he’s been exploring what these latest AI model offerings are *aktshually* good for.

Ram also:
* Operates "id8 inc", his consulting business and innovation incubator
* Is developing "coreograf", a federated context-engineering system to improve efficacy and outcomes when humans and generative AI models collaborate in the development of complex enterprise-scale software
* Co-organizes the "Artificial Intelligencers" meetup group

Area of Expertise

  • Business & Management
  • Information & Communications Technology
  • Media & Information

Topics

  • AI/ML
  • Local LLMs
  • Product Management
  • SLMs
  • Lean Enterprise
  • Lean Startup
  • Lean Product Management
  • Complex Systems
  • Collective Intelligence
  • Complex Adaptive Systems
  • Systems Thinking
  • Functional Programming
  • Elixir

Local LLMs for the Curious Data Practitioner

This presentation with Q&A (up to 60 minutes total) is for data practitioners curious to explore how Large Language Models (LLMs) can be integrated into their workflows while reducing costs, maintaining security, and managing complexity.

Practitioners will learn:
* How LLMs in general work and what they are (and are not) good for
* What LLMOps is and the criteria to consider when deciding whether to use open/closed, local/hosted LLMs
* “Context engineering” techniques to elicit the best possible results from LLMs

Further, they will get an overview of:
* Options for finding, installing, configuring, and managing LLMs on their own devices
* Techniques for making domain-specific data available to LLMs
* How LLMs can be incorporated into their existing working environment and workflows (a minimal sketch follows this list)
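
To make that last point concrete, here is a minimal, illustrative sketch (not part of the session materials). It assumes an Ollama server running locally on its default port and a model already pulled (e.g. with `ollama pull llama3.1`); any local runner that exposes an HTTP API could be swapped in.

```python
# Minimal sketch: asking a locally hosted LLM a question over HTTP.
# Assumes an Ollama server on its default port (11434); the model name
# "llama3.1" is illustrative only.
import requests


def ask_local_llm(prompt: str, model: str = "llama3.1") -> str:
    """Send one prompt to the local model and return its text reply."""
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]


if __name__ == "__main__":
    print(ask_local_llm("In one sentence, what is a large language model?"))
```

Because the model runs entirely on the practitioner's own device, prompts and data never leave the machine, which is central to the cost and security trade-offs the session covers.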

Practitioners will leave the session with enough information to begin exploring LLMs in general, and how local LLMs in particular can help them achieve desired data-centric outcomes.

[Note to conference organizers: This is a condensed overview of a one-day workshop that provides hands-on details on the same subjects.]

Leveraging Local LLMs for the Resource-Constrained Data Practitioner

This one-day hands-on-keyboards working session is for data practitioners curious to explore how Large Language Models (LLMs) can be integrated into their workflows, but who have concerns about cost, security, and complexity.

On their own laptops, starting from scratch, practitioners will learn how to:
* Find, install, configure, and manage LLMs on their own devices
* Integrate those LLMs into a working environment
* And, utilize LLMs in a data workflow (see the sketch after this list)
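
As an illustration of that workflow step, the sketch below applies a locally hosted model to a pandas DataFrame. The ticket data, column names, and summarization task are hypothetical, and the same local-Ollama assumptions as the earlier sketch apply; the workshop's actual exercises may differ.

```python
# Illustrative sketch: enriching a pandas DataFrame with a local LLM.
# The data and task are hypothetical; assumes a local Ollama server on
# its default port (11434) with the illustrative "llama3.1" model pulled.
import pandas as pd
import requests


def ask_local_llm(prompt: str, model: str = "llama3.1") -> str:
    """Send one prompt to the local model and return its text reply."""
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]


# Hypothetical free-text support tickets; any text column works the same way.
tickets = pd.DataFrame({
    "ticket_id": [101, 102],
    "comment": [
        "The app crashes every time I try to log in.",
        "Love the new export feature, but the button is hard to find.",
    ],
})

# Add a model-generated summary column, one row at a time.
tickets["summary"] = tickets["comment"].apply(
    lambda c: ask_local_llm(f"Summarize this customer comment in five words or fewer:\n\n{c}")
)
print(tickets[["ticket_id", "summary"]])
```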

They will also learn:
* How LLMs in general work and what they are (and are not) good for
* LLMOps criteria to consider when deciding whether to use open/closed, local/hosted LLMs
* How to make domain-specific data available to their LLMs
* And, “context engineering” techniques to elicit the best possible results from LLMs (see the brief sketch after this list)
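
The sketch below illustrates the last two bullets together: a naive keyword match stands in for retrieval of domain-specific notes, and a structured prompt (role, constraint, context, question) stands in for basic context engineering. The notes, helper, and retrieval method are illustrative assumptions rather than the workshop's prescribed approach; a real setup would typically use a search index or vector store.

```python
# Illustrative sketch: supplying domain-specific notes to a local LLM via a
# structured, "context engineered" prompt. The notes and the naive keyword
# retrieval are placeholders; assumes a local Ollama server (port 11434).
import requests


def ask_local_llm(prompt: str, model: str = "llama3.1") -> str:
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]


# Hypothetical domain knowledge the base model has never seen.
DOMAIN_NOTES = [
    "Orders table: one row per order; status is one of placed, shipped, returned.",
    "Returns filed more than 30 days after shipment go to the exceptions table.",
    "Revenue figures exclude tax and are stored in USD cents.",
]


def retrieve(question: str, notes: list[str], k: int = 2) -> list[str]:
    """Naive retrieval: rank notes by how many words they share with the question."""
    words = set(question.lower().split())
    return sorted(notes, key=lambda n: -len(words & set(n.lower().split())))[:k]


def answer(question: str) -> str:
    context = "\n".join(f"- {note}" for note in retrieve(question, DOMAIN_NOTES))
    prompt = (
        "You are a careful data analyst. Answer using ONLY the notes below; "
        "if they are insufficient, say so.\n\n"
        f"Notes:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
    return ask_local_llm(prompt)


print(answer("Where do returns filed after 30 days show up, and in what units is revenue stored?"))
```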

Practitioners will leave the session with:
* A functional local LLM working environment set up on their laptops
* Experience using this environment in a simplified but realistic use case
* Sufficient understanding about how to customize this setup to suit their specific/evolving needs
* And, a basis to continue exploring and refining how they employ LLMs to achieve desired data-centric outcomes

[Note to conference organizers: This is an expanded, hands-on version of a ~40-minute presentation providing an overview of the same subjects.]
