Semantic Memory for LLMs: Graph-Based Retrieval and Reasoning with Node.js & Neo4j

Large Language Models excel at generating fluent text — but they still lack memory: the ability to store knowledge structurally, recall relationships, reason over connections, and explain how they arrived at an answer.

In this talk, we explore how Semantic Memory emerges when LLMs are combined with Knowledge Graphs, enabling deep contextual understanding and multi-hop reasoning that traditional RAG struggles to deliver.

Using Node.js + LangChain + LangGraph + Neo4j, we build an end-to-end Graph-Based Retrieval system that extracts entities and relations from unstructured data, stores them as a dynamic knowledge graph, and enables precise, explainable reasoning through GraphRAG-style pipelines.
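To make that pipeline concrete, below is a minimal TypeScript sketch of the kind of GraphRAG-style flow described here: an LLM extracts (subject, relation, object) triples from text, neo4j-driver upserts them into Neo4j, and a multi-hop Cypher traversal feeds graph context back to the model. The model name, node labels, Cypher patterns, and helper functions (extractTriples, storeTriples, answer) are illustrative assumptions, not the speaker's actual code.

```typescript
// Hypothetical GraphRAG-style sketch: extract triples with an LLM, store them in Neo4j,
// then answer a question from a multi-hop graph traversal. Assumes NEO4J_URI,
// NEO4J_USER, NEO4J_PASSWORD and OPENAI_API_KEY are set in the environment.
import neo4j from "neo4j-driver";
import { ChatOpenAI } from "@langchain/openai";

const driver = neo4j.driver(
  process.env.NEO4J_URI ?? "bolt://localhost:7687",
  neo4j.auth.basic(process.env.NEO4J_USER ?? "neo4j", process.env.NEO4J_PASSWORD ?? "password")
);
const llm = new ChatOpenAI({ model: "gpt-4o-mini", temperature: 0 });

// 1. Ask the LLM to extract [subject, relation, object] triples from raw text.
//    (A real pipeline would validate/repair the JSON before parsing.)
async function extractTriples(text: string): Promise<[string, string, string][]> {
  const res = await llm.invoke(
    `Extract knowledge triples from the text below as a JSON array of ` +
    `[subject, relation, object] arrays. Return only JSON.\n\nText: ${text}`
  );
  return JSON.parse(String(res.content));
}

// 2. Upsert each triple as two Entity nodes and a relationship in Neo4j.
async function storeTriples(triples: [string, string, string][]): Promise<void> {
  const session = driver.session();
  try {
    for (const [subj, rel, obj] of triples) {
      await session.run(
        `MERGE (a:Entity {name: $subj})
         MERGE (b:Entity {name: $obj})
         MERGE (a)-[:RELATED {type: $rel}]->(b)`,
        { subj, rel, obj }
      );
    }
  } finally {
    await session.close();
  }
}

// 3. Retrieve 1- to 2-hop context around an entity and let the LLM answer from it only.
async function answer(question: string, entity: string): Promise<string> {
  const session = driver.session();
  try {
    const result = await session.run(
      `MATCH path = (a:Entity {name: $entity})-[*1..2]-(b:Entity)
       RETURN [n IN nodes(path) | n.name] AS chain LIMIT 25`,
      { entity }
    );
    const context = result.records.map((r) => r.get("chain").join(" -> ")).join("\n");
    const res = await llm.invoke(
      `Answer using only this graph context:\n${context}\n\nQuestion: ${question}`
    );
    return String(res.content);
  } finally {
    await session.close();
  }
}
```

The MERGE upserts keep the graph incremental as new documents arrive, and the variable-length match ([*1..2]) is what gives the retrieval step its multi-hop character. A fuller implementation would likely swap the hand-rolled extraction prompt for LangChain's graph-extraction utilities and orchestrate the extract/store/retrieve steps as a LangGraph workflow; those traversal paths are also what make each answer explainable.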

Attendees will learn how graph-backed memory transforms LLMs from pattern generators into context-aware agents capable of semantic recall and logical inference, with enterprise-grade reliability.

Luiz Calaça

Software Engineer, Data Scientist and Professor
