Kosy Ashara
AI Engineer - BNP Paribas
London, United Kingdom
Kosy builds smarter, faster data platforms for the real world. With a background spanning data science, engineering, and platform management, she now drives data and AI initiatives at a major investment bank. Focused on scaling systems, streamlining workflows, and pushing the boundaries of what data can do, Kosy is passionate about making AI and infrastructure work better, not just bigger.
Thriving as a Black Woman in Tech: Lessons from Data, AI & Banking
This non-technical talk shares my personal story of breaking into data and AI while working in London’s competitive investment banking industry. I'll highlight the challenges and opportunities of being a Black woman in tech, strategies to overcome bias, and how representation drives innovation.
What attendees will learn:
• The importance of diversity in data and AI fields.
• Practical strategies for women and underrepresented groups to thrive.
• How allies and companies can create inclusive environments.
• Personal insights into navigating corporate tech while staying authentic.
No-Code AI Automation with n8n: Build Smarter Workflows Without Writing a Line of Code
Not everyone needs to write Python or deploy Kubernetes clusters to use AI. With n8n, a no-code/low-code workflow automation tool, anyone can build AI-powered apps and data pipelines visually. In this session, we’ll explore how to integrate LLMs, APIs, and Google Cloud services into automated workflows using n8n — no heavy coding required. Live demos will show use cases like customer support automation, document summarization, and reporting.
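Under the hood, a workflow like the ones demoed above is just a chain of nodes, each passing its output to the next. The sketch below mimics that shape in plain Python so the idea is concrete; the node names and the word-truncation "summary" are stand-ins, not n8n's actual API, and a real workflow would call an LLM service from the summarization step.

```python
# Illustrative sketch of an n8n-style node chain: Webhook -> LLM -> Notify.
# Each "node" is a function that transforms the previous node's output.

def webhook_node(payload):
    """Receive an incoming document, as a Webhook trigger node would."""
    return {"text": payload["document"]}

def summarize_node(item, max_words=10):
    """Stand-in for an LLM node; a real workflow calls a model API here."""
    words = item["text"].split()
    item["summary"] = " ".join(words[:max_words])
    return item

def notify_node(item):
    """Stand-in for an email/Slack node: format the outgoing message."""
    return f"Summary ready: {item['summary']}"

def run_workflow(payload, nodes):
    """Pass data through each node in order, like n8n's visual chain."""
    data = payload
    for node in nodes:
        data = node(data)
    return data

result = run_workflow(
    {"document": "Quarterly report shows strong growth in the data platform business this year"},
    [webhook_node, summarize_node, notify_node],
)
print(result)
```

In n8n itself you would drag these three nodes onto the canvas and wire them visually instead of writing any of this code; the point is only that the data-flow model is this simple.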
Audience Takeaway: Attendees will leave knowing how to get started with no-code AI using n8n, making AI accessible for developers and non-developers alike.
Building Your Personal AI Toolbox: How to Leverage Today’s AI Tools Without Being an Expert
AI is everywhere, but many professionals think they need to be experts to use it. This session is about demystifying AI tools and showing attendees how to build a simple “AI toolbox” for their daily work — whether in data engineering, analytics, or even project management.
What attendees will learn:
• Beginner-friendly AI tools.
• How non-data scientists can leverage AI in their workflows.
• Examples of quick wins in business contexts (investment banking, reporting, data cleaning).
• Tips for staying ahead of AI trends without being overwhelmed.
LLMs Are Broken Without MCP: The Next Step in Smarter AI Systems
Large Language Models weren't built for the real world: they struggle once the context they need outgrows what a single prompt can hold. The Model Context Protocol (MCP) is the missing piece, giving models a standard way to pull in external context and tools on demand instead of cramming everything into the prompt. In this session, you'll learn how MCP servers unlock smarter context management for LLMs and how Databricks users can integrate this approach to build faster, more scalable AI workflows.
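To make the context-growth problem concrete, here is a minimal sketch of the naive strategy most systems start with: when conversation history exceeds the model's context budget, keep the system prompt and drop the oldest turns first. The word-count "token" estimate is a deliberate simplification; real systems use the model's tokenizer.

```python
# Naive context trimming: keep the system message, then as many of the
# most recent turns as fit within the budget. Token counts are crudely
# approximated by word count for illustration.

def trim_context(messages, budget):
    """Keep messages[0] (system prompt), then the newest turns that fit."""
    def cost(msg):
        return len(msg["content"].split())  # crude token estimate

    system, turns = messages[0], messages[1:]
    remaining = budget - cost(system)
    kept = []
    for msg in reversed(turns):          # walk newest-first
        if cost(msg) <= remaining:
            kept.append(msg)
            remaining -= cost(msg)
        else:
            break                        # everything older is dropped
    return [system] + list(reversed(kept))
```

This is exactly the kind of lossy workaround the session argues against: MCP lets the model fetch the right context when it needs it, rather than forcing you to guess what to throw away.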
Building a Retrieval-Augmented Generation (RAG) Chatbot: Making AI Speak Your Data
Chatbots powered by Large Language Models (LLMs) are impressive — but without grounding in real data, they often make things up. Retrieval-Augmented Generation (RAG) provides a solution by combining LLMs with vector databases and search pipelines. This session will walk attendees through designing and deploying a practical RAG chatbot that can answer domain-specific questions using private datasets.
What attendees will learn:
• The architecture of a RAG system (LLM + embedding model + vector database).
• How to ingest and index data for efficient retrieval.
• Building a pipeline to connect a chatbot with enterprise or project-specific data.
• A live demo of a working RAG chatbot.
• Best practices: handling hallucinations, scaling queries, and securing private data.
Prerequisites:
• Intermediate Python experience.
• Basic knowledge of machine learning or databases.
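The retrieval half of the architecture above can be sketched in a few lines. This toy version uses a bag-of-words vector and cosine similarity as a stand-in for a real embedding model and vector database; the sample documents and query are invented for illustration.

```python
# Toy RAG retrieval: embed documents and a query, rank by cosine
# similarity, and prepend the top hits to the prompt so the LLM
# answers from real data instead of making things up.
import math
import re
from collections import Counter

def embed(text):
    """Toy embedding: a word-count vector (stand-in for a real model)."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    """Return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "The settlement cutoff for equity trades is 4pm London time.",
    "Holiday calendars are published every January.",
]
context = retrieve("When is the trade settlement cutoff?", docs)
prompt = "Answer using only this context:\n" + "\n".join(context)
```

In production, the embedding model, the index, and the prompt template each become their own component; the session covers how those pieces fit together and how to keep the model grounded in the retrieved text.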
Automating Workflows with Model Context Protocol (MCP): Building API Clients and Servers
The Model Context Protocol (MCP) is a new open standard that allows LLMs to securely connect with external tools, APIs, and data sources. In this session, we’ll demystify MCP by showing how to build your own MCP client and server — enabling AI systems to go beyond text responses and actually perform actions, integrate APIs, and automate data workflows.
What attendees will learn:
• What MCP is and why it matters for the future of AI applications.
• The structure of MCP clients and servers.
• How to expose APIs to an LLM through MCP.
• Demo: creating a simple MCP server to interact with a data pipeline API.
• Use cases in data platforms: automating ETL, monitoring, and reporting with AI.
Prerequisites:
• Familiarity with REST APIs and JSON.
• Intermediate Python or JavaScript knowledge.
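To show the request/response shape involved, here is a heavily simplified sketch of the dispatch an MCP server performs. MCP speaks JSON-RPC with methods such as `tools/list` and `tools/call`; everything else here (the inlined dispatcher, the `pipeline_status` tool) is a hypothetical stand-in, since a real server would use the official MCP SDK over a transport like stdio.

```python
# Simplified MCP-style dispatch: a client can discover tools via
# "tools/list", then invoke one via "tools/call". The pipeline_status
# tool is an invented example, not part of the protocol.
import json

TOOLS = {
    "pipeline_status": {
        "description": "Report the status of a named data pipeline.",
        "handler": lambda args: {"pipeline": args["name"], "status": "healthy"},
    }
}

def handle(request_json):
    """Dispatch one JSON-RPC request and return the JSON response."""
    req = json.loads(request_json)
    if req["method"] == "tools/list":
        result = {"tools": [{"name": n, "description": t["description"]}
                            for n, t in TOOLS.items()]}
    elif req["method"] == "tools/call":
        tool = TOOLS[req["params"]["name"]]
        result = tool["handler"](req["params"]["arguments"])
    else:
        return json.dumps({"jsonrpc": "2.0", "id": req["id"],
                           "error": {"code": -32601, "message": "unknown method"}})
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})
```

The payoff is that an LLM client no longer guesses at a pipeline's state: it discovers the tool, calls it, and works from the actual response.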
AI Guardrails in Practice: Keeping Cloud-Native AI Systems Safe
AI systems are only as good as their guardrails. Without safeguards, they risk hallucinations, biased outputs, or security leaks. This session introduces AI Guardrails and shows how to build them into your cloud-native workflows. We’ll explore validation layers, policy enforcement, and monitoring, using Google Cloud tools and open-source frameworks. Real-world scenarios will highlight how to keep AI systems not just powerful, but also safe and compliant.
Audience Takeaway: Attendees will understand the “why and how” of AI guardrails, and walk away with practical techniques to add safety layers to their AI pipelines.
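One of the validation-layer patterns described above can be sketched very simply: inspect model output before it reaches the user. The specific rules below (an email regex for PII, a length cap) are illustrative only; production guardrails layer checks like these with policy engines and monitoring.

```python
# Minimal output-validation guardrail: redact PII and enforce a length
# cap, reporting which rules fired so violations can be monitored.
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def apply_guardrails(output, max_chars=500):
    """Return (safe_output, violations) for one piece of model output."""
    violations = []
    if EMAIL_RE.search(output):
        output = EMAIL_RE.sub("[REDACTED EMAIL]", output)
        violations.append("pii:email")
    if len(output) > max_chars:
        output = output[:max_chars] + "..."
        violations.append("length")
    return output, violations
```

Wired in front of the user-facing response, a layer like this turns "hope the model behaves" into an enforced, auditable policy, which is the core idea the session develops with cloud-native tooling.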