Session
Analysing 4 billion rows of data using Power BI DirectLake and Fabric
Data volumes keep going up. For most businesses, analysing more data at a faster pace is increasingly important to stay ahead.
In this session, I will demonstrate Microsoft Fabric Lakehouses with a massive dataset of 4 billion rows.
We will look at the performance implications of analysing a dataset of that size and showcase Power BI's DirectLake capability to handle big data without copying it.
After this session you will be able to:
1. Understand how DirectLake differs from Import and DirectQuery
2. Set up a Power BI DirectLake connection in Microsoft Fabric (see the sketch after this list)
3. Run real-time analyses on datasets with billions of rows, without breaking a sweat
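To give a feel for objectives 2 and 3, here is a minimal sketch of what this can look like inside a Fabric notebook. It is an illustration only, not the demo from this session: the source path, the table name "sales", the semantic model name "Sales Model", and the DAX query are all placeholder assumptions.

```python
# Minimal sketch (Fabric notebook). All names below are placeholders,
# not taken from this session.

# 1) Land the data as a Delta table in the Lakehouse attached to the notebook.
#    DirectLake reads these Delta/Parquet files in place, so there is no
#    import/copy step for the semantic model.
df = spark.read.parquet("Files/raw/sales/")          # assumed source location
df.write.mode("overwrite").format("delta").saveAsTable("sales")

# 2) Query the DirectLake semantic model built on that table.
#    semantic-link (sempy) is available in Fabric notebooks.
import sempy.fabric as fabric

result = fabric.evaluate_dax(
    dataset="Sales Model",                           # placeholder model name
    dax_string="""
        EVALUATE
        SUMMARIZECOLUMNS(
            'sales'[Year],
            "Total Amount", SUM('sales'[Amount])
        )
    """,
)
display(result)
```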
First public delivery of this presentation. Target audience: technical (data engineers, data analysts). Preferred duration: 20-30 min.
Bas Land
That Fabric Guy - Data Architect - Co-Founder
Woudenberg, The Netherlands