Theo Lebrun
Senior Data Engineer at Ippon Technologies
New York City, New York, United States
Theo Lebrun is a seasoned Data Engineer and Technology Consultant at Ippon Technologies with extensive experience in cloud computing, databases, and software development. With expertise in a range of tools and technologies including AWS, Kafka, Databricks, and Python, Theo has helped numerous clients optimize their data pipelines, meeting and exceeding their goals.
Theo's technical expertise is matched by his passion for sharing his knowledge with others. He has authored multiple blog posts breaking down technical topics such as high-scale streaming development with Kafka and Databricks tips and tricks. He has also presented at multiple conferences, including DevNexus and RVATech Data Summit.
Area of Expertise
Topics
Unlocking AI and Machine Learning at Scale with Snowflake Cortex
AI and machine learning are transforming industries, but building scalable, high-performance ML solutions remains a challenge. Snowflake Cortex simplifies this by enabling users to run AI and ML models directly in the Snowflake Data Cloud.
In this session, I'll dive into how Snowflake Cortex can help you manage large-scale data pipelines, train and deploy models, and streamline your ML workflows—all without the need for complex infrastructure. Whether you're handling massive datasets or deploying AI models, this talk will provide actionable insights to help you accelerate your machine learning initiatives using Snowflake Cortex.
Don’t miss the opportunity to learn how to integrate cutting-edge ML capabilities into your data strategy.
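As a taste of what the session covers, Cortex exposes LLM functions directly in SQL. The sketch below assembles a call to `SNOWFLAKE.CORTEX.COMPLETE` in Python; the model name and prompt are placeholders, and actually executing the statement requires the `snowflake-connector-python` package and valid account credentials (shown only in comments):

```python
# Sketch: invoking a Snowflake Cortex LLM function through plain SQL.
# The model ID and prompt below are illustrative placeholders.

def cortex_complete_sql(model: str, prompt: str) -> str:
    """Build a SQL statement that calls SNOWFLAKE.CORTEX.COMPLETE."""
    escaped = prompt.replace("'", "''")  # basic single-quote escaping
    return f"SELECT SNOWFLAKE.CORTEX.COMPLETE('{model}', '{escaped}')"

sql = cortex_complete_sql("mistral-large", "Summarize our pipeline run metrics")

# With a live connection you would run something like:
# import snowflake.connector
# conn = snowflake.connector.connect(account=..., user=..., password=...)
# answer = conn.cursor().execute(sql).fetchone()[0]
```

Because Cortex functions are just SQL, they slot into existing warehouse queries without any separate model-serving infrastructure, which is the core point of the session.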
Creating a Modern Web App Using Spring Boot and Vue.js with JHipster
In this talk, I will demonstrate how JHipster can generate, in a few minutes, a modern web app that uses Spring Boot and Vue.js. JHipster provides everything you need to start a complete modern web app and take it to production in a very short amount of time. On top of that, the back-end is built as a high-performance and robust Java stack with Spring Boot. More details can be found here: https://www.jhipster.tech/.
Build next generation Big Data applications with Delta Lake
Delta Lake (https://delta.io/) is an open-source storage framework that enables building a Lakehouse architecture with compute engines such as Spark, along with APIs for Scala/Java, Python, and Rust. A Lakehouse is a modern data architecture that reimagines the data warehouse in response to the availability of affordable and highly reliable storage.
Delta Lake provides key benefits and will fit perfectly in your Big Data architecture:
- ACID Transactions
- Schema Evolution
- Time Travel
- Audit History
- Handle petabyte-scale tables
- Platform Agnostic (Cloud, On-prem, or locally)
- DML operations through its SQL and Spark API
In this presentation, I will provide an introduction to Delta Lake, explaining how it works, and its key features and benefits. Whether you're a data scientist, data engineer, or business analyst, this session is for you.
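To make the feature list above concrete, Delta Lake exposes DML and time travel through ordinary SQL. The sketch below only assembles the statements as Python strings; running them requires a live SparkSession configured for Delta (each string would be passed to `spark.sql(...)`), and the table and column names are illustrative:

```python
# Sketch of Delta Lake's SQL surface: DDL, MERGE upserts, time travel,
# and audit history. Table/column names are placeholders; with a Delta-
# enabled SparkSession you would execute each via spark.sql(statement).

statements = {
    "create": "CREATE TABLE events (id INT, payload STRING) USING delta",
    # ACID upsert: insert new rows, update matching ones, atomically
    "upsert": (
        "MERGE INTO events t USING updates s ON t.id = s.id "
        "WHEN MATCHED THEN UPDATE SET t.payload = s.payload "
        "WHEN NOT MATCHED THEN INSERT *"
    ),
    # Time travel: query the table as it existed at an earlier version
    "time_travel": "SELECT * FROM events VERSION AS OF 0",
    # Audit history: who changed what, and when
    "history": "DESCRIBE HISTORY events",
}
```

Each committed operation creates a new table version in the transaction log, which is what makes both time travel and the audit history possible.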
Craft your own Generative AI Chatbot with Amazon Bedrock
In this compelling session, dive deep into the fascinating world of Artificial Intelligence and learn how to craft your very own Generative AI Chatbot using the powerful capabilities of Amazon Bedrock.
Your Chatbot will be able to answer questions based on your own documents. Whether you're a data scientist, data engineer, or business analyst, this session is for you!
By the end of this session, you'll have a comprehensive understanding of the tools and techniques required to build a Generative AI Chatbot on Amazon. Whether you're a seasoned developer or someone just starting their AI journey, this talk equips you with the skills to unleash your creativity and innovation in the realm of Artificial Intelligence.
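To hint at the mechanics, a Bedrock chatbot turn boils down to building a request body and invoking a model. The sketch below assembles a body for an Anthropic model on Bedrock, grounding the answer in retrieved document text (a simple RAG pattern); the model ID, region, and context string are placeholders, and the actual call (which needs `boto3` and AWS credentials) is shown only in comments:

```python
import json

def build_request(question: str, context: str) -> dict:
    """Assemble a request body for an Anthropic model on Bedrock,
    grounding the answer in retrieved document text."""
    return {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "messages": [
            {
                "role": "user",
                "content": (
                    f"Answer using only this context:\n{context}\n\n"
                    f"Question: {question}"
                ),
            }
        ],
    }

body = json.dumps(build_request("What is our refund policy?", "<retrieved doc text>"))

# With boto3 and AWS credentials configured:
# import boto3
# client = boto3.client("bedrock-runtime", region_name="us-east-1")
# resp = client.invoke_model(
#     modelId="anthropic.claude-3-sonnet-20240229-v1:0", body=body)
```

In the full session, the `<retrieved doc text>` placeholder is where your own documents come in: relevant passages are retrieved first and stuffed into the prompt so the model answers from your data.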
Building Applications That Can Reason With LangChain
In today's rapidly evolving AI landscape, creating applications that can effectively harness the power of Large Language Models (LLMs) is crucial. In this talk, I will explore LangChain, a powerful framework that enables the seamless integration of LLMs with external data sources and custom workflows.
You’ll learn how to build dynamic data pipelines, enhance language model performance, and develop intelligent applications capable of complex reasoning and contextual responses. Whether you're working with chatbots, content generation, or data-driven decision-making, LangChain unlocks new possibilities for leveraging AI in real-world scenarios.
Join me to discover how LangChain can supercharge your LLM projects!
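The core idea behind a LangChain pipeline can be sketched in plain Python. The functions below are illustrative stand-ins, not LangChain APIs: `fake_llm` takes the place of a real model call, and `chain` composes the stages the way LangChain's LCEL pipes a prompt, a model, and an output parser (`prompt | llm | parser`):

```python
# Plain-Python analogy for a LangChain chain: prompt -> model -> parser.
# All names here are illustrative stand-ins, not real LangChain APIs.

def prompt(inputs: dict) -> str:
    # Fills a prompt template from structured inputs
    return f"Summarize in one line: {inputs['text']}"

def fake_llm(text: str) -> str:
    # A real chain would call an LLM here; we echo for illustration.
    return f"SUMMARY({text})"

def parser(raw: str) -> str:
    # Normalizes the raw model output for downstream use
    return raw.strip()

def chain(inputs: dict) -> str:
    # Function composition, analogous to LCEL's `prompt | llm | parser`
    return parser(fake_llm(prompt(inputs)))

result = chain({"text": "Delta Lake adds ACID transactions to data lakes."})
```

The framework's value is that each stage is swappable: the same chain shape works whether the middle step is a hosted LLM, a local model, or a retrieval-augmented call against your own data.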
Devnexus 2023 Sessionize Event