Cathrine Wilhelmsen

Information & Communications Technology

Business Intelligence, Data Warehousing, Data Integration, ETL, Azure Data Platform, Azure Data Factory, Microsoft Data Platform, SQL Server Integration Services, SSIS, Biml, BimlScript, SQL Server, SQL

Oslo, Norway

Senior BI Consultant at Inmeta | Data Platform MVP

Cathrine loves teaching and sharing knowledge. She is based in Norway and works as a Senior Business Intelligence Consultant at Inmeta, focusing on Data Warehousing, Data Integration, Analytics, and Reporting projects. Her core skills are Azure Data Factory, SSIS, Biml and T-SQL development, but she enjoys everything from programming to data visualization. Outside of work she's active in the SQL Server community as a Microsoft Data Platform MVP, BimlHero Certified Expert, author, speaker, blogger, organizer and chronic volunteer.


Cathrine is based in Norway, but loves traveling and speaking internationally.

Current sessions

Biml for Beginners: Script and Automate SSIS Development

Are you tired of creating and updating the same SSIS packages over and over and over again? Is your wrist hurting from all that clicking, dragging, dropping, connecting and aligning? Do you want to take the next step and start automating your SSIS development?

Say goodbye to repetitive work and hello to Biml, the markup language for Business Intelligence projects.

In this session, we will first look at the basics of Biml and how to automatically generate SSIS packages from database metadata. Then we will explore techniques for reusing code and implementing changes across projects with just a few clicks. Finally, we will create an example project that you can download and use as a starting point, generating all the SQL scripts and SSIS packages needed to build a staging environment in just a few minutes.
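For readers who have not seen Biml before, here is a minimal BimlScript sketch of the metadata-driven idea: a C# code nugget loops over a list of table names (hardcoded here as a placeholder; a real solution would read this from database metadata) and emits one SSIS package per table. The "Staging" connection name is an assumption and would be defined elsewhere in the solution.

```xml
<#@ template language="C#" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Packages>
    <# foreach (var table in new[] { "Customer", "Product", "Sales" }) { #>
    <Package Name="Load_<#=table#>" ConstraintMode="Linear">
      <Tasks>
        <ExecuteSQL Name="SQL Truncate <#=table#>" ConnectionName="Staging">
          <DirectInput>TRUNCATE TABLE stg.<#=table#>;</DirectInput>
        </ExecuteSQL>
      </Tasks>
    </Package>
    <# } #>
  </Packages>
</Biml>
```

Compiling this one file produces three packages; change the list (or the metadata query behind it) and regenerate them all with a single click.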

Stop wasting your valuable time on doing the same things over and over and over again, and see how you can complete in a day what once took more than a week!

Session Level: 100-200 / Beginner-Intermediate
Session Length: 45-75 minutes
Prerequisites: Must have experience with basic SSIS development. Should have knowledge about Data Warehousing and ETL concepts.


Level Up Your Biml: Best Practices and Coding Techniques

Is your Biml solution starting to remind you of a bowl of tangled spaghetti code? Good! That means you are solving real problems while saving a lot of time. The next step is to make sure that your solution does not grow too complex and confusing - you do not want to waste all that saved time on future maintenance!

Attend this session for an overview of Biml best practices and coding techniques. Learn how to improve and simplify your solution by using some common and some lesser-known Biml features. If standard Biml is not enough, you can implement custom logic by creating your own C# classes and methods. Finally, see how to bring everything together in an example project for creating and loading a data mart with facts and dimensions.

Start improving your code today and level up your Biml in no time!

Session Level: 300-400 / Intermediate-Advanced
Session Length: 60-90 minutes
Prerequisites: Must have experience with Biml development.


Biml Tips and Tricks: Not just for SSIS packages!

"Wait, what? Biml is not just for generating SSIS packages?"

Absolutely not! Come and see how you can use Biml (Business Intelligence Markup Language) to save time and speed up other Data Warehouse development tasks. You can generate complex T-SQL statements with Biml instead of using dynamic SQL, create test data, and even populate static dimensions.

Don't Repeat Yourself, start automating those boring, manual tasks today!

Session Level: 200-300 / Intermediate
Session Length: 45-75 minutes
Prerequisites: Must have experience with Biml development.


Uhms and Bunny Hands: Tips for Improving Your Presentation Skills

Are you considering becoming a speaker, but feel nervous about getting on stage for the first time? Have you already presented a few sessions and want advice on how to improve? Do you learn more from seeing examples of what you should NOT do during a presentation instead of reading a list of bullet points on how to become a better speaker?

Don't worry! I have made plenty of presentation mistakes over the years so you won't have to :)

In this session, we will go through common presentation mistakes and how you can avoid them, as well as how you can prepare for those dreaded worst-case scenarios. Don't let those "uhms" and "uhhs" dominate your presentation; help the audience focus on the key message you're delivering instead of making them read a wall of text on your slides; recover gracefully from any demo failures; and stop distracting your attendees with floppy bunny hands.

All it takes is a little preparation and practice. You can do this!

Session Level: 100 / Beginner
Session Length: 30-75 minutes
Prerequisites: None :)


Packages or Pipelines? Azure Data Factory for the SSIS Developer

Azure Data Factory (ADF) is a hybrid data integration service that lets you build, orchestrate, and monitor complex and scalable data pipelines - without writing any code! The first version of Azure Data Factory may not have lived entirely up to its nickname "SSIS in the Cloud", but the second version has been drastically improved and expanded with new capabilities.

But wait, what's that? You have already invested years and millions in a comprehensive SSIS solution, you say? No problem! You can lift and shift your existing SSIS packages into Azure Data Factory to start modernizing your solution while retaining the investments you have already made.

In this session, we will first go through the fundamentals of Azure Data Factory and see how easy it is to build powerful data pipelines or migrate existing SSIS packages. Then, we will explore some of the major improvements in Azure Data Factory v2, including Mapping and Wrangling Data Flows for creating visual data transformations. Finally, we will look at how to trigger and schedule our packages or pipelines, and how to monitor our solution once it has been deployed.

Session Level: 100-200 / Beginner-Intermediate
Session Length: 45-90 minutes
Prerequisites: Must have experience with SSIS development and administration. Should be familiar with Data Integration and/or ETL concepts and scenarios.


Pipelines and Transformations: Introduction to Azure Data Factory

As Data Engineers and ETL Developers, our main responsibilities are to ingest, store, transform, integrate, and prepare data for our end users as quickly and efficiently as possible. With the ever-increasing volume and variety of data, this can often feel like a daunting task.

Azure Data Factory (ADF) is a hybrid data integration service that lets you build, orchestrate, and monitor complex and scalable data pipelines - without writing any code!

In this session, we will first go through the fundamentals of Azure Data Factory and see how easy it is to build powerful data pipelines. Then, we will explore some of the major improvements in Azure Data Factory v2, including Mapping and Wrangling Data Flows for creating visual data transformations. Finally, we will look at how to trigger and schedule our data pipelines, and how to monitor our solution once it has been deployed.

Session Level: 100-200 / Beginner-Intermediate
Session Length: 45-75 minutes
Prerequisites: Should be familiar with Data Integration and/or ETL concepts and scenarios.


Building Dynamic Data Pipelines in Azure Data Factory

You already know how to build, orchestrate, and monitor data pipelines in Azure Data Factory. But how do you go from basic, hardcoded data pipelines to making your solution dynamic and reusable?

In this session, we will dive straight into some of the more advanced features of Azure Data Factory. How do you parameterize your linked services, datasets, and pipelines? What is the difference between parameters and variables, and when should you use them? And how do the expression language and built-in functions really work?

We will answer these questions by going through an existing solution step-by-step and gradually making it dynamic and reusable. Along the way, we will cover best practices and lessons learned.
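As a sketch of the parameterization idea (all names here are hypothetical), a single dataset can serve every table by exposing a parameter and referencing it with the expression language instead of a hardcoded table name:

```json
{
  "name": "GenericSqlTable",
  "properties": {
    "type": "AzureSqlTable",
    "linkedServiceName": {
      "referenceName": "AzureSqlDatabase",
      "type": "LinkedServiceReference"
    },
    "parameters": {
      "tableName": { "type": "string" }
    },
    "typeProperties": {
      "tableName": {
        "value": "@dataset().tableName",
        "type": "Expression"
      }
    }
  }
}
```

A pipeline can then pass a value such as @pipeline().parameters.tableName into the dataset, or a ForEach activity can iterate over a list of tables and pass @item() on each iteration.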

Session Level: 300 / Intermediate
Session Length: 45-75 minutes
Prerequisites: Must have experience with Azure Data Factory development.


Creating Visual Data Transformations in Azure Data Factory

Azure Data Factory v2 came with many new capabilities and improvements. One of the biggest game-changers was the Data Flows feature, allowing you to transform and prepare data at scale - without having to write a single line of code!

First came the Mapping Data Flows, built for data transformation and known-schema to known-schema mapping. Then came the Wrangling Data Flows, built for data preparation and exploration. After that came ALL the questions :) What is the difference between the two types of Data Flows? How are they different from Power BI dataflows? Who should use Mapping, and who should use Wrangling, and when, and why, and how?

In this session, we will go through the capabilities and use cases for both Mapping and Wrangling Data Flows. For Mapping Data Flows, we will dig deeper into the various transformations available, as well as the expression language and how to use the visual expression builder. Finally, we will look at how to debug, monitor, and optimize our data transformations.

Session Level: 200 / Intermediate
Session Length: 45-75 minutes
Prerequisites: Attendees should be familiar with Data Integration and/or ETL scenarios. Experience building solutions in SQL Server Integration Services (SSIS) and/or Azure Data Factory (ADF) is helpful, but not required.


Understanding Azure Data Factory Pricing

Azure Data Factory pricing is easy, right? No upfront costs. Pay only for what you use. It's a wonderful world for developers. Just a few clicks and your solution is ready for production!

But what do you present to management when they ask for cost estimates? "I guess we just have to wait for the next invoice" is rarely an acceptable answer. It is definitely not an acceptable answer if that invoice turns out to be unexpectedly high.

Bring your calculator! In this session, we will go through some common ETL patterns to explain the Azure Data Factory pricing model. Once we have our monthly estimates, we will discuss which patterns are good candidates - and which patterns you probably want to rethink before deploying to production...
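As an illustration of the kind of estimate the session builds, here is a simple monthly calculation for a Copy-activity pattern. The rates below are placeholders, not current Azure prices; the real numbers vary by region and are listed on the Azure Data Factory pricing page.

```python
# Rough ADF cost sketch for one metadata-driven load pattern.
# Rates are illustrative placeholders, NOT current Azure prices;
# always check the official Azure Data Factory pricing page.
RATE_PER_1000_ACTIVITY_RUNS = 1.00   # orchestration, per 1,000 activity runs
RATE_PER_DIU_HOUR = 0.25             # Copy activity data movement

def monthly_copy_cost(tables, runs_per_day, diu, copy_minutes, days=30):
    """Estimate a month of running one Copy activity per table."""
    activity_runs = tables * runs_per_day * days
    # Data movement is billed per DIU-hour (per-minute proration and
    # minimum durations are ignored in this sketch).
    diu_hours = activity_runs * diu * (copy_minutes / 60)
    orchestration = activity_runs / 1000 * RATE_PER_1000_ACTIVITY_RUNS
    data_movement = diu_hours * RATE_PER_DIU_HOUR
    return round(orchestration + data_movement, 2)

# 50 tables loaded hourly, 4 DIUs, roughly 2 minutes per copy:
print(monthly_copy_cost(tables=50, runs_per_day=24, diu=4, copy_minutes=2))
```

Even at placeholder rates, the exercise makes the cost drivers visible: the number of activity runs is cheap, while DIU-hours dominate, which is exactly why some patterns deserve a rethink before production.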

Session Level: 100 / Beginner
Session Length: 20-30 minutes
Prerequisites: Should be familiar with Azure Data Factory concepts.


Past and future events

dataMinds Connect 2019

6 Oct 2019 - 7 Oct 2019
Mechelen, Belgium

Data Saturday Holland

4 Oct 2019 - 4 Oct 2019
Utrecht, Netherlands

Techorama Netherlands 2019

30 Sep 2019 - 1 Oct 2019
Ede, Netherlands

DATA:Scotland 2019

12 Sep 2019 - 12 Sep 2019
Glasgow, United Kingdom