Tags: Business Intelligence, Data Warehousing, Data Integration, ETL, Azure Data Platform, Azure Data Factory, Microsoft Data Platform, SQL Server Integration Services (SSIS), Biml, BimlScript, SQL Server, SQL
As Data Engineers and ETL Developers, our main responsibilities are to ingest, store, transform, integrate, and prepare data for our end users as quickly and efficiently as possible. With the ever-increasing volume and variety of data, this can often feel like a daunting task.
Azure Data Factory (ADF) is a hybrid data integration service that lets you build, orchestrate, and monitor complex and scalable data pipelines - without writing any code!
In this session, we will first go through the fundamentals of Azure Data Factory and see how easy it is to build powerful data pipelines. Then, we will explore some of the major improvements in Azure Data Factory v2, including Mapping and Wrangling Data Flows for creating visual data transformations. Finally, we will look at how to trigger and schedule our data pipelines, and how to monitor our solution once it has been deployed.
Session Level: 100-200 / Beginner-Intermediate
Session Length: 45-75 minutes
Prerequisites: Familiarity with data integration and/or ETL concepts and scenarios.
Cathrine loves data and coding, as well as teaching and sharing knowledge :) She is based in Norway and works as a Senior Business Intelligence Consultant at Inmeta, focusing on Data Integration and Data Warehousing. Her core skills are Azure Data Factory, Azure Synapse Analytics, SSIS, Biml, and T-SQL development, but she enjoys everything from programming to data visualization. Outside of work she's active in the Azure and Microsoft Data communities as a Microsoft Data Platform MVP, international speaker, blogger, organizer, and chronic volunteer. She blogs at cathrinew.net and tweets at @cathrinew.
Though based in Norway, she loves traveling and speaking internationally.