Session

Build and Run Custom Ollama LLMs Privately on Your System: Free, High-Performance, and Simple

If you use an LLM through a chatbot or an API, you are consuming cloud provider resources.
What Is Ollama, and What Are Its Advantages?
 An open-source tool that lets you run LLMs and AI models locally on your own machines and hardware.
 It simplifies the process of running large language models on your own hardware.

What problem does it solve?
1. Privacy concerns – running LLMs locally keeps your data on your own machine.
2. Ease of use – setting up LLMs can be challenging; Ollama makes it an easy process.
3. Cost efficiency – no need for costly cloud-based services.
4. Latency reduction – local execution eliminates network round trips.
5. Customization – greater flexibility in customizing your models.
 Ollama allows developers to build AI-powered applications and features that run entirely on their own machines.
I will cover in this session:
Introduction to Ollama and Setup
Ollama CLI Commands and the REST API - Hands-on
Ollama - User Interfaces for Ollama Models
Integrate Ollama with SQL Server 2025 and PostgreSQL:
Create embeddings & vector search on your data
Building LLM Applications with Ollama Models
This is a demo-only session with minimal theory and lessons from the field!
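As a taste of the REST API portion of the agenda, here is a minimal sketch of calling a local Ollama server's generate and embeddings endpoints, assuming a stock install listening on the default port 11434; the model names (`llama3`, `nomic-embed-text`) and the prompt text are illustrative, not part of the session material.

```python
import json
import urllib.request

# Default local Ollama endpoints (assumption: stock install on port 11434).
GENERATE_URL = "http://localhost:11434/api/generate"
EMBED_URL = "http://localhost:11434/api/embeddings"

def build_request(url: str, payload: dict) -> urllib.request.Request:
    """Build a JSON POST request for the local Ollama REST API."""
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# Text generation (non-streaming); the model name is illustrative.
gen_req = build_request(
    GENERATE_URL,
    {"model": "llama3", "prompt": "Why run LLMs locally?", "stream": False},
)

# Embeddings for vector search, e.g. to store in SQL Server 2025 or PostgreSQL.
emb_req = build_request(
    EMBED_URL,
    {"model": "nomic-embed-text", "prompt": "Ollama runs models locally."},
)

# With an Ollama server running, urllib.request.urlopen(gen_req) returns the
# model's JSON response, and urlopen(emb_req) returns the embedding vector.
```

Because everything goes to `localhost`, the prompts and your data never leave the machine, which is the privacy point made above.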

Yitzhak David

DBA · BDA · Power BI · NoSQL · K8s

Nahariyya, Israel


