Not All Context Is Helpful: Memory, Retrieval, and State in AI Systems

AI teams often try to improve results by adding more context: longer prompts, bigger retrieval sets, persistent memory, and richer conversation history. But more context does not always make a system better. It can make it slower, noisier, less reliable, harder to secure, and more difficult to reason about.
This session gives engineers and architects a practical framework for reasoning about three distinct design tools in AI systems: retrieval, memory, and application state. Using concrete examples, we’ll compare when information should be passed in the current request, fetched just in time, stored across steps, or deliberately left out. We’ll examine common failure modes such as stale memory, irrelevant retrieval, context overload, and hidden state bugs, then show how these choices affect quality, latency, cost, and maintainability.
Attendees will leave able to explain the difference between memory, retrieval, and state; evaluate when each pattern is appropriate; design AI systems with clearer boundaries; and avoid the common trap of assuming that more context always improves outcomes.

Ron Dagdag

Microsoft AI MVP and Research Engineering Manager @ Thomson Reuters

Fort Worth, Texas, United States
