Session
Streamlining Databricks: CI/CD your Notebooks with DevOps Pipelines and orchestrate via Data Factory
Integrating Azure Databricks with Azure Data Factory (ADF) allows you to seamlessly orchestrate data workflows and execute Databricks notebooks within your data pipelines. This integration lets you leverage the strengths of both services: Databricks provides scalable analytics and machine learning capabilities, while ADF complements this by enabling you to schedule, trigger, and manage data movement and transformations.
In this session I'll show you how to 1) provision your workspace infrastructure with Bicep, 2) configure the SCIM provisioning connector and use Microsoft Entra ID security groups to separate users from administrators, 3) deploy your notebooks across environments with Azure DevOps YAML pipelines, and 4) orchestrate your notebooks via Azure Data Factory.
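As a flavour of step 3, a notebook deployment can be sketched as an Azure DevOps YAML pipeline that pushes a repository folder into the workspace with the Databricks CLI. This is a minimal illustrative sketch, not the session's actual pipeline: the variable names (databricksHost, databricksToken), the notebooks folder, and the /Shared/notebooks target path are assumptions.

```yaml
# Hypothetical sketch of a DevOps YAML pipeline that deploys notebooks.
# Assumes a 'notebooks' folder in the repo and pipeline variables
# 'databricksHost' and 'databricksToken' (the token stored as a secret).
trigger:
  branches:
    include:
      - main

pool:
  vmImage: 'ubuntu-latest'

steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '3.10'

  - script: pip install databricks-cli
    displayName: 'Install Databricks CLI'

  # Import the repo's notebooks folder into the workspace, overwriting
  # any existing copies so each run reflects the current branch state.
  - script: databricks workspace import_dir notebooks /Shared/notebooks --overwrite
    displayName: 'Deploy notebooks to workspace'
    env:
      DATABRICKS_HOST: $(databricksHost)
      DATABRICKS_TOKEN: $(databricksToken)
```

In practice you would parameterize the host, token, and target path per environment (dev/test/prod) and gate the later stages with approvals, which is the multi-environment pattern the session walks through.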
Hector Sven
Data Engineering Manager at Avanade Norway, Azure Certified Pro & DevOps Expert
Oslo, Norway