
Benni De Jagere
No coffee? No insights!
Zoersel, Belgium
Benni is a Senior Program Manager in the Power BI Customer Advisory Team (PBICAT) at Microsoft. He aspires to be top notch in his field through continuous personal development of both technical and soft skills. He strives for maximum results in his tasks using team play, communication, thinking outside of the box, and (endless) motivation. Benni is always ready to tackle the unknown, or pick up fresh ideas to broaden his range. Building on past experiences, he continuously tries to find new, more efficient ways of obtaining results, improving the process along the way.
Loving (almost) every day of it, he’s fascinated by the value of data, sometimes flabbergasted by the lack of awareness, and intrigued by the endless possibilities whilst discovering new ways of looking at data. He thrives on unfolding new insights for customers through open and transparent communication.
On a daily basis he turns (large amounts of) coffee into insights for customers, and references witty British comedy, lame dad jokes, and obscure facts way too often. Overly enthusiastic about anything data related, he’s trying hard to keep up with all things new and shiny.
When not working, blogging or reading, you’ll likely find him out and about being his weird self. Rumour has it that he’s also involved with a ragtag band of data enthusiasts, enjoying themselves whilst organising cool community things.
They go by the name of .. dataMinds!
You're a Power BI Admin? Let's get your threads aligned for Fabric!
May 23rd 2023 will be remembered by many as a day that shook the Data & AI ecosystem for Microsoft, potentially even beyond. Many Power BI Admins lived a life of dealing with the woes and worries of a reporting environment, until that very day a few things were made public. With the announcement of Fabric, an all-in-one analytics solution covering a whole new range of possibilities, Power BI Admins have a lot more ground to cover. But how will they do it? How will they keep up with all the new options and responsibilities?
This session will focus on the life of a Power BI Admin moving over to the extended Fabric realm, and keeping their things in check. We'll look at the familiar topics of capacity and workspace management, and how they align with the new workloads that have been introduced. We'll notice there's a very large overlap, as Fabric has been built on the foundations of Power BI.
To wrap up, we'll look at some of the new tools that were introduced to help you keep an overview of the tenant, and make sure the right data gets to the right people. This means getting an inventory of our tenant settings, setting up alerts for when something changes, and working with the newly introduced Admin Monitoring workspace for an easy overview of what is happening where.
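To make the "alert when a tenant setting changes" idea concrete, here is a minimal, hypothetical Python sketch that diffs two tenant-settings snapshots. The simple name-to-enabled dictionary shape and the setting names are illustrative assumptions, not the actual admin API response format.

```python
def diff_tenant_settings(previous, current):
    """Compare two tenant-settings snapshots (setting name -> enabled flag)
    and return a list of human-readable change descriptions."""
    changes = []
    for name, enabled in current.items():
        if name not in previous:
            changes.append(f"NEW setting: {name} (enabled={enabled})")
        elif previous[name] != enabled:
            changes.append(f"CHANGED: {name} {previous[name]} -> {enabled}")
    for name in previous:
        if name not in current:
            changes.append(f"REMOVED setting: {name}")
    return changes

# Example: yesterday's snapshot vs today's (illustrative setting names)
yesterday = {"ExportToExcel": True, "PublishToWeb": False}
today = {"ExportToExcel": True, "PublishToWeb": True, "ServicePrincipalAccess": False}
for change in diff_tenant_settings(yesterday, today):
    print(change)
```

In practice, each snapshot would come from a scheduled export of the tenant settings, and the resulting change list could feed a Teams or e-mail alert.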
Walking out of this session, a Power BI Admin should feel more confident about Fabric, and be convinced it can actually make their life easier.
Using Lakehouse Data at scale with Power BI, featuring Direct Lake mode
Many companies have invested heavily in building data lakes to store large volumes of structured and unstructured data from various sources into Delta Parquet files. These Delta Parquet files can be used for a wide range of Analytics and Business Intelligence applications. Most of these organizations struggle to derive insights from their investments due to the complexity of accessing and querying the data, and the challenge of letting self-service users connect to this data in the lake using Power BI.
With the introduction of Microsoft Fabric, an all-in-one analytics solution for enterprises, we now have a better approach for this. In this session, we will explore how to use Lakehouse data at scale with Power BI, using the new Direct Lake connectivity mode. Power BI Direct Lake combines the best of both worlds from Import and DirectQuery mode, offering great performance over data in the lake without the additional latency of dataset refreshes.
We will start by discussing the benefits of the Lakehouse architecture and how it can improve data management and analytics. We will then move on to explore how to connect to Lakehouse data using Power BI by combining both of these architecture components and using each of them to their strengths.
We will also cover best practices for optimizing performance when working with large volumes of data, including using data partitioning and query optimization techniques. We will demonstrate how to use Power BI to analyze Lakehouse data in real-time and how to build reports that provide actionable insights for decision-making.
By the end of the session, attendees will have a solid understanding of how to leverage Lakehouse data at scale with Power BI and how to build powerful analytics solutions that can handle massive amounts of data. Whether you are a data analyst, data scientist, or BI professional, this session will provide you with valuable insights into the world of Lakehouse data and Power BI, featuring the new Direct Lake connectivity mode.
Pie Charts. Friend or Foe?!
"Pie charts suck!", "Never use pie charts!", or "Can you replace that hideous pie chart?" are probably some of the sentences you've heard when you've proudly presented your latest report to make the corporate data more digestible, or when you've ventured on the wondrous wide world web.
Pie charts have often been at the center of heated conversations between many data visualisation enthusiasts and/or data artists. But do they really need to be "killed with fire", or do they actually serve some good purpose?
During this short lightning talk, we'll discuss the "aha's", "gotchas", and "oh, right .." factors of using a pie chart in your reports. Aimed at making the point that not all pie charts are bad, but they should be used correctly, this will be a fiery rant you don't want to miss.
Keeping the "Direct" in Power BI DirectQuery
When gathering requirements for your new model and reports, you've probably heard it before: "We want to get real-time results from our source", or "We don't want to store data in Power BI, because it's already in the cloud". Maybe the stakeholders don't want to wait for a model refresh that, in their opinion, will take way too long. Who knows; plenty of arguments have been raised before, some with more sense than others. Having worked through my fair share of troubleshooting scenarios, I noticed some recurring themes when DirectQuery was at play.
Using a DirectQuery connectivity mode in Power BI allows you to achieve great results, if executed correctly. With less room for error and interpretation, a DirectQuery approach requires the different cogs in the chain to work well together, to ensure a smooth process.
During this session, we'll discuss some of the common patterns that make opting for a DirectQuery approach a valid scenario. Then, we'll cover the best practices around optimizing the data source, data model, and reports, so we can keep our query performance at an optimal level. Finally, we'll discuss how advanced modelling techniques like Hybrid Tables, Composite Models, Aggregation Tables, and Auto Aggregations can make this scenario a bit more robust.
Last but not least, we'll also discuss the practical usage details you have to educate your end users on, as a report in DirectQuery will be more prone to reduced performance from end-user interactions.
Asking questions to get meaningful answers
Have you ever asked a question, only to get a vague answer? Tired of hearing 'It Depends'? Or maybe you've even been frustrated that you're not getting the answer you were hoping for.
Asking questions is a very powerful tool, but one that can be tricky to wield. By supplying the other party with the compact information they need, and gently "forcing" them to give a decent answer, we can receive far more information. By getting the information we need, we'll be able to proceed quicker and get to results faster.
It can be a tricky process, but with a few basic things to keep in mind, we can continually improve ourselves, for a better understanding.
How to write a Session Abstract, to get chosen to speak
When submitting a session abstract to a conference or user group, you've probably thought: "Ehhhh, I'll put some more details in that later". But then time goes by, and other things get in the way.
Writing an abstract is hard, especially for a session you'll probably only be presenting a few months down the road. But have you consciously considered the words you're putting in there, and what kind of expectations they set for other parties?
Based on experience as a conference organiser, speaker and attendee, we'll dive into the key segments that should be included in your session abstract, to increase your chances of being selected. After all, if the selection committee understands what you'll be explaining, this can only be a good thing. Right?
Having a balanced abstract will also help attendees decide whether your session is the right one for them, and set their expectations correctly before they attend. And of course, you'll help yourself by clearly outlining the content you'll be discussing up front.
** These are based on personal experiences, and hence subjective. Attendees are free to use their own will and creativity to create their own set of handles.
Keeping up with your Power BI Tenant Administration
For many organisations, Power BI is becoming absolutely key in their Information Delivery stream. As an important cog in the chain, it's the organisation's responsibility to keep that cog well oiled.
Power BI Tenant Admins need to make sure they're on top of their game, to keep all parties involved on their good side. During this talk, we'll go through some of the key activities to optimise this process, based on my war stories as a consultant.
Security, Auditing, Monitoring and Alerting are the cornerstone of this process, which doesn't have to be hard. Walking out, you'll have some practical tips to take home with you.
Power BI Premium - Practical Tips for making the most of it
Your company is thinking about investing in Power BI Premium, or has it already taken the plunge? Quite commonly, I've encountered the belief that all (Power BI related) worries will magically disappear when switching over to Premium Capacity. Out in the wild, the reality is often different. Power BI Premium Capacity requires some attention, but giving it that attention will let you do great things!
During this talk, I'll touch on some of the key activities your organisation needs to perform to make sure your investment is one that pays off.
Using the Premium Capacity Metrics App as our base of operations, we'll expand into auditing, licensing, and some common dataset performance patterns, to make sure you're up to the task!
Troubleshooting Power BI Reports
So you've built a Power BI report with all the shinies? Good!
After some usage, users have complained that it's a bit sluggish, and they're not keen on using it anymore.
Starting off with the new Performance Analyzer Pane, we'll demonstrate how you can pinpoint bottlenecks in your report, and take actions on these insights.
With the myriad of root causes that are out there, an elimination approach might suit us very well.
Some of the main causes will be explained, and we'll get you going on how you can fix it for yourselves.
This talk will place emphasis on leveraging external free community tools to assess the state of the data model, and go further on fixing the groundworks of your solution.
Attendees should note that this is not a DAX Deep Dive class, before wandering in :-)
Uncovering Secrets and Mysteries in your Power BI Tenant
Microsoft Power BI is all the rage, with enterprise and user adoption soaring year after year. New users get onboarded; reports, models, dashboards, dataflows, and more get created and tweaked every day. Whoa, things must be going great if we have so many things happening every day! Right?
And yet, as usage skyrockets in our organisation, it's crucial to keep a solid oversight of the artefacts and activities generated by our colleagues. All too often, organisations have procedures, governance models, methodologies, and more in place, without a structural way of making sure they actually work for the business processes. Providing the business with usage information for their artefacts will help them make informed decisions. Yet making sure this information gets to Data Stewards (workspace owners) is often left untouched, as there's no ready-to-go solution out there.
This session will focus on creating a Self-Service Power BI model and report, where users can follow up on activities performed on artefacts in their zone of control. We'll start with collecting the necessary information through various APIs, defining a model with Row-Level Security, and layer a Power BI Report on top of this.
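As a sketch of the collection step, the snippet below builds a request URL for the Power BI admin "Get Activity Events" REST endpoint, which returns audit events for a single UTC day. The endpoint itself is real; the example date is arbitrary, and authentication and paging are only hinted at in comments.

```python
from datetime import date

BASE = "https://api.powerbi.com/v1.0/myorg/admin/activityevents"

def activity_events_url(day: date) -> str:
    """Build the Get Activity Events admin API URL for one UTC day.
    The API requires startDateTime and endDateTime to fall on the same
    day, and the ISO timestamps must be wrapped in single quotes."""
    start = f"'{day.isoformat()}T00:00:00'"
    end = f"'{day.isoformat()}T23:59:59'"
    return f"{BASE}?startDateTime={start}&endDateTime={end}"

print(activity_events_url(date(2024, 1, 15)))

# The actual call needs an admin-scoped access token, e.g. (not run here):
#   requests.get(url, headers={"Authorization": f"Bearer {token}"})
# and the response is paged: keep following 'continuationUri' until it
# comes back empty, collecting 'activityEventEntities' from each page.
```

Events collected this way can then be landed in a table per day and joined to workspace and user metadata to feed the Row-Level Security model described above.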
Leveraging a solution like this can drastically improve the overall view of Power BI usage in the organisation, and support informed decisions for the BI Information Delivery flow.