Antony Catella
BI Consultant @ Adroit Data & Insight
Oxford, United Kingdom
I work as a BI Consultant for a consultancy based in the UK, developing Power BI reports and analytical solutions for multiple clients, primarily non-profit organisations. Outside of work I co-organise the Devon and Cornwall Power BI User Group, which is very rewarding and a great learning experience. I also achieved my DP-600 certification in 2025.
Area of Expertise
Topics
Using Field Parameters to allow users to customise their data analysis
This session is based on a real-life scenario: overcoming the challenge of developing a report that lets hundreds of users across multiple organisations and countries audit their data. As the report developer, it was paramount to accommodate the varied analytical needs of these users without overwhelming the report with a multitude of redundant pages and visuals.
We'll start the session by looking at the report requirements, followed by an introduction to field parameters. From there we'll move on to editing field parameters and building a data model that uses multiple field parameters. We will then develop dynamic visual titles and text to give more context to the visuals. The session will conclude by combining field parameters and calculation groups to enable users to switch between two visual types.
This session is designed to show you how to make your reports not just fully dynamic but also intuitive and user-friendly through the incorporation of Field Parameters.
Target audience = Intermediate users
Session Duration = 50 - 60 mins
Delivered virtually to 4 user groups in the UK, Denmark, Germany and Romania, as well as in-person at DataGrillen
An introduction to analysing Geospatial data in Power BI
You may have heard of one or more of these terms: geospatial data, vector, raster, shapefile, GeoJSON, WKT, choropleth map, heat map and many more.
What do all these geospatial terms mean?
In this session we will look at the essentials of geospatial data analysis within Power BI and aim to provide clarity to the common geospatial terms.
We will start with exploring the fundamentals of geospatial analysis and its application within business intelligence. We will learn about the different types of geospatial data formats and how to prepare that data for use in Power BI.
Now that we have prepared our data, it is time to visualise it in Power BI using the native map visuals, such as Azure Maps and ArcGIS, as well as custom visuals for enhanced analysis, such as Icon Map Pro. We will look at different types of maps, including backgrounds and reference layers.
This session will also cover some related geospatial tools, such as mapshaper, QGIS and Mapbox Studio, that help you prepare and clean your data and create custom maps.
This session is designed for data analysts, business intelligence professionals and anyone interested in learning to analyse geospatial data in Power BI.
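The format terms above can be illustrated without any GIS libraries. Here is a minimal sketch using only the Python standard library (the feature and its coordinates are invented for illustration): it parses a GeoJSON point and re-expresses it as WKT, a text format that custom Power BI map visuals such as Icon Map Pro can accept.

```python
import json

# A minimal GeoJSON feature, as you might export from QGIS or mapshaper
feature = json.loads("""
{
  "type": "Feature",
  "properties": {"name": "Oxford"},
  "geometry": {"type": "Point", "coordinates": [-1.2577, 51.7520]}
}
""")

lon, lat = feature["geometry"]["coordinates"]

# WKT (Well-Known Text) expresses the same geometry as plain text.
# Note that GeoJSON orders coordinates as [longitude, latitude].
wkt = f"POINT ({lon} {lat})"
print(wkt)  # POINT (-1.2577 51.752)
```

The same idea scales up: polygons become `POLYGON ((lon lat, ...))` strings, which is how boundary data often travels into a report.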
Transforming a broken PBI report in the slow lane to cruising in the fast lane
"The pbix file is slow to import the data. Most of the visuals are not rendering in the report" - how do we fix this report?
This session is based on a real-life use case of re-developing a multi-page Power BI report from "slow and unusable" into a single-page report that was "fast and nimble".
In this session we will walk through the process of transforming this report: changing how data was imported in Power Query, revising the semantic model and then making use of Field Parameters in the re-designed report page. Along the way we will cover best practices and how implementing them enabled the client to use the report with ease, reducing the size of the pbix file by over 90% in the process.
By the end of this session, you'll be equipped with practical techniques to enhance the performance and design of your Power BI reports.
This session is ideal for Power BI users, data analysts, and business intelligence professionals who want to improve the performance and efficiency of their Power BI reports
User Defined Formatting in Power BI
In general, the formatting (column colours, display units, data label formats) in a Power BI report is fixed for the end user. How can we give end users the ability to format aspects of the visuals to suit their needs?
In this session, we look at making use of dynamic format strings and conditional formatting in Power BI, to allow end users to change the formatting of visuals. We begin with the basic concepts of dynamic format strings, which allow for the customisation of displayed data based on context and user interactions. Attendees will then learn how to apply these strings in reports to enhance the readability of data visuals.
We then move on to using conditional formatting rules and expressions to change the colours of data points within visuals. We also demonstrate combining dynamic format strings with conditional formatting to give end users significant flexibility in what to display in visuals.
By the end of this session, participants will have a comprehensive understanding of how to leverage these features and include them for their report end users.
Target audience = Intermediate Power BI users
Session Duration = 50 - 60 mins
Presented virtually to user groups in the UK, Romania and Sweden
Future in-person and virtual sessions booked for 2025
Previous session link https://www.youtube.com/watch?v=Ri1uVWwtLzc
Usage Metrics Report - build the historical data using Fabric
Who is using the report? How often is the report being opened? Which report pages are being viewed? Which report pages are not being viewed?
These were questions a client was asking about a key Power BI report. To answer them, we could make use of the Usage Metrics report within the Power BI service. The main issue is that the new Usage Metrics report only shows a rolling 30 days of data, so it would have limited value. To provide the insights required, we needed to build a history of the usage data beyond the last 30 days.
In this session, we will start with an overview of the default usage metrics report in the service. We will then learn how to use OneLake integration and Dataflows Gen2 to load data from the default usage metrics semantic model into a Warehouse on a daily basis, re-build the semantic model and develop a customised Usage Metrics Report showing data beyond the 30-day limit.
We will go through the steps required to implement such a solution as well as covering some of the challenges that people may encounter when doing so.
GeoAnalytics in Microsoft Fabric using KQL
Unlock the potential of geospatial data using Microsoft Fabric and Kusto Query Language (KQL). This session dives into the powerful geospatial capabilities of KQL and how it can be leveraged to query and analyse location-based data at scale.
Explore how to analyse geospatial datasets, run advanced spatial operations such as distance calculations and clustering using KQL, and then visualise your results in Power BI or KQL Querysets.
Through hands-on examples, we'll demonstrate writing KQL queries to perform geospatial joins, polygon intersections, and real-time data streaming, all within Microsoft Fabric. Whether you're a data engineer, analyst, or developer, this session will empower you to transform raw spatial data into actionable insights using KQL.
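The clustering mentioned above typically works by snapping each point to a spatial cell and counting points per cell. As a rough plain-Python stand-in for that idea (this is not KQL — the grid size and sample points are invented for the example, and KQL uses proper S2/geohash cells rather than a degree grid):

```python
from collections import Counter

def grid_cell(lat: float, lon: float, size_deg: float = 0.5) -> tuple:
    """Snap a coordinate to a fixed-size grid cell - a simplified
    stand-in for KQL-style spatial cell bucketing."""
    return (round(lat // size_deg * size_deg, 4),
            round(lon // size_deg * size_deg, 4))

# Invented sample points: three near Oxford, one near Plymouth
points = [(51.75, -1.26), (51.74, -1.25), (50.37, -4.14), (51.76, -1.27)]

# Count points per cell -- the core of a "cluster and count" query
clusters = Counter(grid_cell(lat, lon) for lat, lon in points)
for cell, count in clusters.most_common():
    print(cell, count)
```

In KQL the per-cell grouping would be a `summarize ... by` over a cell id; the counting logic is the same.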
GeoAnalytics in Microsoft Fabric using Python Notebooks
Harness the power of Microsoft Fabric Python Notebooks for a seamless blend of data engineering, visualization, and advanced analytics for geospatial data.
In this session we learn how to ingest, process, and analyse geospatial datasets, leveraging libraries such as ArcGIS GeoAnalytics, GeoPandas and Folium within Fabric Notebooks. Discover techniques for performing geospatial transformations, distance calculations and spatial joins, all while visualising your insights in interactive maps and charts.
We’ll walk through practical examples, from analysing earthquake data to weather station data. Attendees will leave with the skills to analyse spatial data using a code-first approach.
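The distance calculations mentioned above boil down to the haversine formula, which libraries like GeoPandas implement for you. A minimal sketch in plain Python (the coordinates are approximate and chosen purely for illustration):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres,
    assuming a spherical Earth of radius 6371 km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

# Approximate coordinates for Oxford and Exeter
print(round(haversine_km(51.752, -1.258, 50.718, -3.534), 1))  # roughly 196 km
```

Spatial joins then layer a predicate (nearest, within a radius, inside a polygon) on top of primitives like this.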
Visualising geospatial data in Microsoft Fabric
The question may be asked: which experience do you use to visualise your data within Fabric? For many data analysts and engineers the answer would be Power BI. While that is certainly true, there are other visualisation options to consider in Microsoft Fabric.
In this session we are going to cover the four tools that you can use to visualise data in Microsoft Fabric, with the main focus on the visualisation of geospatial data.
We will explore the use of various Python libraries, such as GeoPandas, in Notebooks, as well as demonstrating how to embed a Power BI report in a Notebook. We continue our journey with KQL Querysets, followed by creating map visuals in Real-Time Dashboards. Finally, we will go over the mapping options (both native and custom visuals) that are available to developers in Power BI.
By the end of this session, attendees will have a good understanding of the various visualisation options available in Fabric.
Geospatial Python - GeoPandas for Beginners using Fabric Notebooks
Step into the world of GeoPandas, one of Python's most powerful geospatial libraries. In this session, we’ll introduce you to the fundamentals of working with spatial data using GeoPandas in Microsoft Fabric Notebooks, enabling you to analyse, visualise, and manipulate geospatial datasets with ease.
We will start with the basics of GeoDataFrames, spatial joins and geometry operations. We will then learn how to work with popular file formats like Shapefiles, GeoJSON and GeoParquet. Discover tools for performing spatial queries, intersections and creating stunning maps - all with just a few lines of Python code.
Whether you’re new to Python or just starting with geospatial data, this session will provide you with the tools and confidence to dive into spatial analysis using GeoPandas.
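Under the hood, the spatial queries and intersections mentioned above rest on primitives like the point-in-polygon test (GeoPandas delegates this to Shapely/GEOS). A simplified ray-casting sketch in plain Python — a toy version that ignores edge cases such as points exactly on a boundary:

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: is (x, y) inside the polygon?
    polygon is a list of (x, y) vertices in order."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does a horizontal ray cast to the right cross this edge?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(point_in_polygon(2, 2, square))   # True
print(point_in_polygon(5, 2, square))   # False
```

In GeoPandas the equivalent one-liner is a `within` predicate on a GeoSeries, which spatial joins apply row by row for you.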
Usage Metrics Report - build the historical data using Power Automate
Who is using the report? How often is the report being opened? Which report pages are being viewed? Which report pages are not being viewed?
These were questions a client was asking about a key Power BI report. To answer them, we could make use of the Usage Metrics report within the Power BI service. The main issue is that the new Usage Metrics report only shows a rolling 30 days of data, so it would have limited value. To provide the insights required, we needed to build a history of the usage data beyond the last 30 days.
In this session, we will start with an overview of the default usage metrics report in the service. We will learn how to use Power Automate to query the default usage metrics semantic model and save the data to CSV files on a daily basis. We will then use Power BI to import the data, re-build the semantic model and develop a customised Usage Metrics Report showing data beyond the 30-day limit.
We will go through the steps required to implement such a solution as well as covering some of the challenges that people may encounter when doing so.