Most Active Speaker

Koen Verbeeck

Senior Consultant @ AE - Data Platform MVP

Rotselaar, Belgium

Koen Verbeeck is a data professional working at AE. He helps organizations gain insight into their data and improve their analytics solutions. Koen has over a decade of experience in developing data warehouses, models and reports using the Microsoft data platform. He has been a Microsoft Data Platform MVP since 2017. He has a blog at http://www.sqlkover.com, writes articles for MSSQLTips.com and is a frequent speaker at SQL Server events.

Area of Expertise

  • Information & Communications Technology

Topics

  • Azure Data Factory
  • SQL Server
  • SQL Server Analysis Services
  • SQL Server Integration Services
  • Azure SQL Database
  • Data Visualization
  • Power BI
  • SQL Server Reporting Services
  • Master Data Services

Introduction to Azure Data Factory

Azure Data Factory (ADF) is the tool in the Azure cloud for creating your data pipelines. It allows you to move and transform data from a multitude of sources to as many possible destinations. In this session, we will give an introduction to this product and demonstrate how you can build your first pipeline.
The following topics are covered:
* creating a pipeline and scheduling it for execution
* monitoring the results
* migrating SSIS packages to ADF
* what are mapping and wrangling data flows?

You don't need any special knowledge for this session, but an understanding of the Azure cloud and ETL tools is a plus.

Finding and Fixing T-SQL Anti-Patterns with ScriptDOM

Quality code is free of what we call 'anti-patterns': NOLOCK hints, using SELECT *, queries without table aliases and so on.
We may also need to enforce certain standards: naming conventions, ending statements with semicolons, indenting code consistently and so on. Furthermore, we may need to apply specific configurations to database objects, such as creating tables on certain filegroups or using specific settings for indexes.

All of this may be easy with a small database and a small volume of code, but what happens when we need to deal with a large volume of code? What if we inherit something full of these anti-patterns, and we just don't have the time to go through all of it manually and fix it? But suppose we had an automated utility that could do this for us? Even better, what if we could integrate it into our Azure DevOps pipelines?

ScriptDOM is a lesser-known free library from SQL Server DacFx that can find programmatic and stylistic errors in T-SQL code (a.k.a. linting). It can even fix some of these errors!
In this session we will learn what it is, how we can harness its power to read code and tell us what it finds, and how to actually fix some of those anti-patterns.
Join us for this highly interactive and demo-packed session for great insights on how to improve the quality of your code. Basic knowledge of T-SQL and PowerShell is recommended to get the most out of this session.
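As a taste of what such a linter looks for, the sketch below contrasts a few of the anti-patterns mentioned above with a cleaned-up equivalent (table and column names are hypothetical, purely for illustration):

```sql
-- Anti-patterns: SELECT *, a NOLOCK hint, no table alias, no terminating semicolon
-- (dbo.Sales and its columns are made-up names for illustration)
SELECT *
FROM dbo.Sales WITH (NOLOCK)

-- Cleaned up: explicit column list, no lock hint, a table alias, and a semicolon
SELECT s.OrderID,
       s.OrderDate,
       s.Amount
FROM dbo.Sales AS s;
```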

feedback link: https://sqlb.it/?7110

Fast Forward your Career by Contributing to the Community

Getting involved in the Data Platform community can be beneficial for your career, both short-term and long-term. Not everyone has the desire to jump on a stage and start speaking in front of an audience, but luckily you don't need to! There are many other options out there for building your personal brand. In this session, we'll go over these options and see how they can have a great impact on your professional life!

T-SQL Best Practices and Tuning for the BI Developer

When building a data warehouse on the Microsoft Data Platform stack, you typically have to write a fair share of T-SQL. Loading data into the data warehouse, transforming it into the desired format, troubleshooting and debugging, writing source queries for reports: all of those tasks require some T-SQL, and preferably fast T-SQL at that.
In this session, we'll go over some T-SQL best practices from the viewpoint of the BI developer. We'll also cover indexing for the data warehouse and how you can make that SSRS report a bit faster. At the end, you'll walk out with solid knowledge of how to improve your T-SQL skills and deliver results faster. Basic knowledge of T-SQL and SQL Server is assumed.
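As one example of the kind of best practice this session covers, wrapping a column in a function in the WHERE clause makes the predicate non-SARGable and blocks an index seek; a range comparison avoids that (the table and columns below are hypothetical):

```sql
-- Non-SARGable: the function on the column prevents an index seek
SELECT f.SalesAmount
FROM dbo.FactSales AS f            -- hypothetical fact table
WHERE YEAR(f.OrderDate) = 2021;

-- SARGable rewrite: compare the column directly against a date range
SELECT f.SalesAmount
FROM dbo.FactSales AS f
WHERE f.OrderDate >= '20210101'
  AND f.OrderDate <  '20220101';
```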

Practical Data Visualization Best Practices

Data visualization is the next hot topic! Tools like Power BI and Tableau make it easy to create stunning visualizations and dashboards. However, creating data visualizations that effectively communicate data takes a bit more effort and consideration. In this session we cover:

* an introduction to the key concepts of data visualization
* what is visual perception and how can it help us?
* why does everyone love pie charts, and why shouldn't we?
* how can we lie with charts?

The modern Cloud Data Warehouse - Snowflake on Azure

The era of cloud data warehousing has finally begun. Gone are the days when you had to provision expensive servers and spend days configuring and tweaking to get the technical details right. Using cloud infrastructure, we can skip the technical set-up and start loading data immediately.

Snowflake is a vendor offering a native cloud data warehouse, hosted on Azure. In this session, we'll introduce you to this new technology and explain the important concepts to get you started. Walking out of this session, you'll have all the knowledge you need to embark on a project with Snowflake.

Supercharge your data warehouse with Columnstore Indexes

BI solutions nowadays need to be faster, bigger and more real-time. How can your data warehouse handle enormous amounts of data and still have blazing fast performance? In this session we will cover the technology of columnstore indexes:

* what are they and how can you use them?
* how do you get data in them? And out?
* what kind of performance can you expect?

Join Koen in this session to learn how you can make your data warehouse faster!
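To give a flavour of the syntax involved (the fact table name is hypothetical, for illustration only), turning a table into a clustered columnstore index is a one-liner in T-SQL:

```sql
-- Rebuild a (hypothetical) fact table as a clustered columnstore index,
-- which stores the data column-wise and compresses it heavily
CREATE CLUSTERED COLUMNSTORE INDEX cci_FactSales
ON dbo.FactSales;
```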

SQL Server Integration Services Best Practices and Tuning Tips

Did you ever inherit a slow-running, unmaintainable mess of an SSIS package? Or do you want to take your SSIS development skills to the next level?

In this session we will guide you through common SSIS best practices and performance pitfalls. We will make packages easier to read, maintain and more efficient. You will learn some practical tips to make your life – and that of your successor – easier.

Topics covered in this session:

* How to quickly lay out your SSIS packages and how to add documentation
* A few hidden tricks of SSIS
* Best practices for SSIS development
* A couple of quick performance boosters

Join us to learn how to be a top-notch SSIS developer.

Self-service BI - Why Data Modelling Is Still Important

Self-service business intelligence is on the rise. Tools like Power BI, Power Query and Power Pivot make it easy for everyone to create a powerful model that can hold millions of rows. However, it's not all cookies and rainbows. To get good performance and correct results, a good data model is required. In this session, we go over the basics of data modelling for business analysis. You will learn to apply a set of guidelines to make your models better, faster and more robust.

Moving Data around in Azure

In the Microsoft Data Platform ecosystem in Azure, there are several options for moving data around. The choice can be overwhelming and it's not always clear which option to use.
In this session, we aim to give you an overview of several Azure products for data pipelines: Azure Data Factory, Logic Apps, Integration Services et cetera. At the end, you'll have a better understanding of the various alternatives in Azure.

Migrating SSIS to the Cloud

Integration Services is a mature ETL product that has been around for more than a decade. A true alternative is still missing in the Azure cloud, but you can migrate your existing projects to Azure. In this session, we'll cover the options you have for your migration project and we'll go into detail on how we can run SSIS packages in Azure Data Factory.

Data Warehouse Automation with Metadata-driven Development. For Free!

Working on your 20th staging package? Do you need to import 30 different flat files? Don't have the time to create all those packages?
With Biml - Business Intelligence Markup Language - you can cut back on development time by:
* generating your SSIS packages and database objects using metadata
* turning tedious, repetitive work into reusable scripts
* automating general ETL patterns

In this session, we will go over the basics of Biml and BimlScript and learn how we can use metadata to create a framework for automating your data integration tasks.
After this session, you can start with the automation of your SSIS packages.
A working knowledge of Integration Services package development is necessary for this session.

SQLBits 2022

March 2022 London, United Kingdom

dataMinds Connect 2021

October 2021 Mechelen, Belgium

Techorama 2021 Spring Edition

May 2021 Antwerpen, Belgium

dataMinds Saturday

March 2021

dataMinds Connect 2020 (Virtual Edition)

October 2020 Mechelen, Belgium

CloudBrew 2019 - A two-day Microsoft Azure event

December 2019 Mechelen, Belgium

dataMinds Connect 2019

October 2019 Mechelen, Belgium

DataGrillen 2019

June 2019 Lingen, Germany

Techorama Belgium 2019

May 2019 Antwerpen, Belgium

dataMinds Connect 2018

Beyond the basics - Take your Biml skills to the next level

October 2018 Gent, Belgium

SQLSaturday Holland 2018

Introduction to Biml - Generating your SSIS packages from scratch

October 2018 Utrecht, Netherlands
