Session

From Zero to Database-backed Support Bot - Using the New GenAI Stack: Docker, LangChain, Ollama & Neo4j

With the breakthrough of large language models, generative AI capabilities are now within reach of every developer. But where to start?

In a partnership between Docker, Neo4j, LangChain, and Ollama, we created the GenAI Stack for building database-backed GenAI applications.

With a single "docker-compose up", the stack's containers are up and running, and you can start importing data and creating vector embeddings, as well as use an example chatbot application that answers natural-language questions using a combination of a Large Language Model and a Knowledge Graph.
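
As a rough sketch of what that looks like in practice (assuming the stack's public repository, docker/genai-stack on GitHub, and the project's default services and ports):

```sh
# Clone the GenAI Stack and start all of its containers
# (Neo4j, Ollama, the import and chatbot apps) in one go.
git clone https://github.com/docker/genai-stack.git
cd genai-stack
docker compose up
```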

In this session, we will look behind the scenes at the containers of the GenAI Stack, how they work together, and how the LangChain and Streamlit Python apps are implemented. We will use data from Stack Overflow, so you can fetch the topics you're interested in.
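
To give a flavor of the app code, here is a minimal sketch (not the stack's actual source) of wiring an Ollama-served model and a Neo4j vector index together with LangChain; the model name, connection details, and index name are illustrative placeholders.

```python
# Minimal retrieval-augmented QA sketch against a Neo4j vector index,
# using an Ollama-served model via LangChain. All connection details,
# the model name, and the index name are assumptions for illustration.
from langchain.chat_models import ChatOllama
from langchain.embeddings import OllamaEmbeddings
from langchain.vectorstores import Neo4jVector
from langchain.chains import RetrievalQA

embeddings = OllamaEmbeddings(base_url="http://localhost:11434", model="llama2")
llm = ChatOllama(base_url="http://localhost:11434", model="llama2")

# Attach to a vector index that was created when the Stack Overflow
# data was imported and embedded.
store = Neo4jVector.from_existing_index(
    embeddings,
    url="bolt://localhost:7687",
    username="neo4j",
    password="password",
    index_name="stackoverflow",  # hypothetical index name
)

# Retrieve similar questions/answers from the graph and let the LLM
# compose an answer from them.
qa = RetrievalQA.from_chain_type(
    llm=llm,
    chain_type="stuff",
    retriever=store.as_retriever(),
)

print(qa.run("How do I create a vector index in Neo4j?"))
```

In the stack, a Streamlit front end sits on top of chains like this one to provide the chatbot UI.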

But we will not stop there! Based on the existing code, we will build and run our own GenAI app that extends the existing functionality, and thanks to the quick Docker setup, you can code along.

This should give you the setup, confidence, and convenience to get going with your first applications that use large language models, and to shorten the time from idea to working application.

Alison Cossette

Data Science Strategist, Advocate, Educator

Burlington, Vermont, United States
