Building AI workflows: from local experiments to serving users
Anyone can throw together an LLM, some MCP tools, and a chat interface, and get an AI assistant we could only dream of a few years ago. Add some “business logic” prompts and you get an AI workflow; hopefully a helpful one.
But how do you take it from a local hack to a production application? Typically, you drown in privacy questions, juggle npx commands for MCP servers, and end up debugging OAuth flows before anything starts to make sense.
In this session, we show a repeatable process for turning your local AI workflow experiments into a production-ready deployment using containerized, static configurations.
Whether you keep the chat interface or replace it with an application UI, you’ll leave with solid ideas for going from a cool demo to a real application without the existential dread of DevOps.
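
To make the “containerized, static configuration” idea concrete, here is a minimal sketch (not the exact setup from the session): an MCP client configuration that pins a containerized MCP server instead of an ad-hoc npx command. The server name and the mcpServers shape follow the common convention used by many MCP clients, and mcp/fetch is used as an example image from Docker’s MCP catalog; treat both as illustrative.

```json
{
  "mcpServers": {
    "fetch": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "mcp/fetch"]
    }
  }
}
```

Compared with the npx-based equivalent, a container image like this can be version-pinned, scanned, and shipped unchanged from a laptop experiment to a production deployment.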

Oleg Šelajev
Testcontainers, AI, & Developer relations at Docker.