Lakeflow Declarative Pipelines Demystified: A Practical Guide to Delta Architecture
This session introduces Lakeflow Declarative Pipelines (formerly Delta Live Tables) and offers a practical framework for moving from static, batch-oriented jobs to dynamic, declaratively managed data engineering, grounded in medallion architecture principles.
We will cover ingestion with Auto Loader, configuring Lakehouse Federation and Lakeflow Connectors for external data sources, and evaluating compute options by comparing classic Spark clusters with serverless execution. The session will also address the differences between streaming tables and materialized views, data quality frameworks including Expectations, DQX, and Lakehouse Monitoring, and approaches to CI/CD and DataOps for operationalizing Lakeflow workflows.
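To make the declarative style concrete, here is a minimal sketch of a two-table pipeline combining several of the topics above: Auto Loader ingestion into a bronze streaming table and an Expectation guarding a silver table. The source path and column names are hypothetical, and this definition only executes inside a Databricks Lakeflow (DLT) pipeline, not as a standalone script:

```python
# Sketch of a Lakeflow Declarative Pipeline definition (Python API).
# Runs only within a Databricks pipeline; paths/columns are illustrative.
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Bronze: raw events ingested incrementally with Auto Loader")
def bronze_events():
    return (
        spark.readStream.format("cloudFiles")          # Auto Loader source
        .option("cloudFiles.format", "json")
        .load("/Volumes/main/raw/events")              # hypothetical location
    )

@dlt.table(comment="Silver: cleaned events")
@dlt.expect_or_drop("valid_id", "event_id IS NOT NULL")  # Expectation: drop bad rows
def silver_events():
    return dlt.read_stream("bronze_events").select(
        col("event_id"), col("event_ts"), col("payload")
    )
```

Because the tables are declared rather than scheduled imperatively, the pipeline engine infers the dependency graph (bronze → silver) and manages incremental processing and retries.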
Participants will gain a clear understanding of how to use Lakeflow Declarative Pipelines to build production-ready workflows, along with practical insights into trade-offs encountered when deploying Databricks in enterprise environments.