Session
Modern Data Engineering with Lakeflow Declarative Pipelines & Databricks Orchestration
Cloud platforms continue to evolve, AI raises the bar, and modern data stacks demand engineers who can build pipelines that are scalable, cost-efficient, and resilient. The team that covers all of this and reaches the finish line fastest wins. This abbreviated hands-on workshop takes you from fundamentals to production-ready practices, using Lakeflow Connect to move your data from source to Unity Catalog quickly and cheaply.
You’ll learn how to ingest raw data with Lakeflow, orchestrate and monitor workloads, and apply real-world techniques that save time and money. By the end of the session, you’ll walk away with frameworks you can apply immediately to your organization’s projects.
What You’ll Learn
- How to design and implement Lakeflow Connect Pipelines (see the sketch after this list).
- How to orchestrate, schedule, and monitor your workloads in Databricks.
- A framework for evaluating new tools and practices in a rapidly changing, AI-driven world.
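For orientation, here is a minimal sketch of what a declarative pipeline definition can look like with the Python dlt module (Lakeflow Declarative Pipelines, formerly Delta Live Tables). The table names, columns, and source path are hypothetical placeholders, and the code assumes it runs inside a Databricks pipeline where spark is provided.

import dlt

# Hypothetical cloud storage location; replace with your own source.
RAW_PATH = "s3://example-bucket/raw/orders/"

@dlt.table(comment="Raw orders ingested incrementally with Auto Loader.")
def orders_raw():
    # Auto Loader (cloudFiles) picks up new files as they arrive.
    return (
        spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load(RAW_PATH)
    )

@dlt.table(comment="Cleaned orders ready for downstream consumption.")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")
def orders_clean():
    # Rows failing the expectation above are dropped and surfaced in pipeline metrics.
    return dlt.read_stream("orders_raw").select("order_id", "customer_id", "amount")

Expectations such as expect_or_drop are one way declarative pipelines expose data-quality metrics, which ties into the orchestration and monitoring topics above.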
Chris Gambill
Founder | Gambill Data | Fractional Data Strategist and Leader
Knoxville, Tennessee, United States