Rishi Sapra
Data Platform MVP | Data & Analytics Consultant, Speaker, Trainer and Technology Evangelist specialising in Data Visualisation (Power BI) and Microsoft Fabric
Rishi is a Chartered Accountant and Microsoft MVP (technology influencer/speaker) with an Executive MBA (Hons) and a first-class degree from the London School of Economics.
He currently leads Data & AI strategic projects at Avanade (Accenture-Microsoft JV), focusing on Microsoft Fabric solutions, governance frameworks, and enterprise migrations.
With 20+ years' experience across Big 4 firms, technology consulting and leading financial institutions (Deloitte, KPMG, HSBC, Barclays, Accenture), he combines deep financial expertise with cutting-edge data and AI capabilities to drive business transformation.
He runs an accelerator programme, Power Platform Finance (www.powerplatformfinance.com), for finance professionals to learn Power BI through a combination of interactive e-learning and instructor-led training that enables learners to apply it effectively to their own data and processes.
Your Budget Is Still in Excel. Your Agents Don't Know That.
Dashboards tell you what happened. Budgets, targets and forecasts - the things that explain whether it matters - still live in spreadsheets, disconnected from your data platform and invisible to AI agents. Planning in Fabric IQ changes this: business intent is written back to SQL tables that agents can query, with Row-Level Security, audit trails and approval workflows built in. We build a governed planning layer and show ontologies and agents reasoning against actuals and targets in the same query.
What-if Scenario Modelling with Power BI and Power Apps
Power BI isn’t just about trends and patterns - you can also enter parameters which feed into your model and drive outputs under different scenarios. In this session Rishi will use 2.4 billion rows of NYC taxi data for scenario modelling with potential taxi drivers being able to model revenue/profits under trips with different locations, weather conditions and journey durations in order to work out the optimal taxi driving strategy!
We will build this model using a simple Power Apps form embedded into Power BI, supported by a basic workflow. This solution is made even more seamless by combining a Direct Query connection to the parameter table with an imported data model on the full dataset, utilising the Composite Model feature of Power BI.
The scenario modelling techniques you learn here can be applied to your own data and optimisation strategies and you will see how easy it is to take Power BI beyond just reporting!
Give Your Agents an Actual Job: Custom Skills with Fabric and Foundry IQ
Most teams build agents from scratch for every process - same problems, different prompts, different answers. You'd never hire someone without a job description and training on how you work. Agents need the same. Custom skills give them that: your Fabric data, your logic in SQL, your guardrails in structured form. Using Fabric IQ and Foundry IQ, we build a governed, reusable agent from a real business process, end-to-end. Every skill compounds: agents that know how you work, not just how AI works.
Making AI Smart: Custom Skills for Real-World Business Logic
Most teams deploy AI agents that deliver inconsistent answers to the same question. This isn't a fault in the AI itself - it's a lack of context about how the organization works. The solution? Custom skills.
In this session, we’ll walk through how to build a custom skill end-to-end, starting with extracting business logic from existing processes. We'll then structure this logic in Microsoft Fabric and expose it to agents via the Model Context Protocol (MCP), ensuring governed access and built-in guardrails.
Through this hands-on session, you’ll gain a reusable four-layer pattern - Data, Logic, Tools, Governance - that can be applied to any business process within your team. By the end, you’ll understand how to ensure your agents deliver consistent, relevant answers tailored to the unique needs of your organization, unlocking smarter, more context-aware AI interactions.
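The four-layer pattern can be pictured in plain Python. This is a minimal sketch, not the Fabric/MCP implementation; every name and data value below is hypothetical and exists only to show how Data, Logic, Tools and Governance separate:

```python
# Minimal sketch of the four-layer custom-skill pattern: Data, Logic,
# Tools, Governance. All names are hypothetical; in practice the data
# layer is Fabric and the tool is exposed to agents via an MCP server.

# Data layer: a stand-in for a governed Fabric table.
ORDERS = [
    {"region": "EMEA", "amount": 1200.0},
    {"region": "EMEA", "amount": 800.0},
    {"region": "APAC", "amount": 500.0},
]

# Logic layer: business logic lives in one place, not in each prompt.
def revenue_by_region(region: str) -> float:
    return sum(o["amount"] for o in ORDERS if o["region"] == region)

# Governance layer: guardrails checked before the logic runs.
ALLOWED_REGIONS = {"EMEA", "APAC"}

# Tools layer: the single entry point an agent is allowed to call.
def revenue_skill(region: str) -> dict:
    if region not in ALLOWED_REGIONS:
        return {"error": f"region '{region}' is not permitted"}
    return {"region": region, "revenue": revenue_by_region(region)}

print(revenue_skill("EMEA"))   # {'region': 'EMEA', 'revenue': 2000.0}
print(revenue_skill("LATAM"))  # governed refusal, not a hallucinated answer
```

Because every agent calls the same tool over the same logic, two agents asking the same question get the same answer - the consistency problem the session opens with.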
From Actuals to Intent: Designing AI Agents That Understand Business Goals
Your data platform tells AI what happened - but not what you intended to achieve. Budgets, targets, and forecasts are often trapped in disconnected spreadsheets, leaving AI agents to operate without an understanding of business goals.
In this session, we introduce the “intent gap” - the disconnect between historical data and business planning - and show how closing it transforms what AI can do. Using Fabric IQ, we demonstrate how planning data can be written back into governed SQL tables, making business intent accessible, structured, and queryable by AI agents.
We’ll build the full stack live: a planning layer, semantic model, ontology, and AI agent - and explore how each layer contributes to making intent machine-readable. You’ll see firsthand how agents behave differently when they can reason not just over actuals, but against targets, forecasts, and strategic goals.
By the end of this session, you’ll understand how to move from hindsight-driven analytics to intent-aware AI systems, and why giving AI access to your plans is the key to unlocking more meaningful, goal-driven outcomes.
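To make the "intent gap" concrete, here is a toy sketch using SQLite: once targets sit in a governed SQL table alongside actuals, a single query can reason over both. Table and column names are illustrative only, not Fabric IQ's actual schema:

```python
# Sketch of closing the intent gap: actuals and targets live in the
# same SQL store, so one query compares performance against plan.
# Schema and values are invented for the illustration.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE actuals (month TEXT, revenue REAL);
CREATE TABLE targets (month TEXT, revenue REAL);
INSERT INTO actuals VALUES ('2024-01', 95.0), ('2024-02', 110.0);
INSERT INTO targets VALUES ('2024-01', 100.0), ('2024-02', 100.0);
""")

# An agent can now ask about variance against plan, not just history.
rows = con.execute("""
    SELECT a.month, a.revenue - t.revenue AS variance
    FROM actuals a JOIN targets t ON a.month = t.month
    ORDER BY a.month
""").fetchall()

for month, variance in rows:
    print(month, variance)  # 2024-01 -5.0 / 2024-02 10.0
```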
Taming the Firehose: Technical Acceleration and ADHD
Fabric features every week, AI agents everywhere. DAX functions you've never heard of. The data platform world moves faster than any human can realistically keep up with—and for neurodivergent professionals, especially those with ADHD, this creates a unique paradox.
The same traits that make ADHD developers excellent at this work—pattern recognition across technologies, hyperfocus when solving complex problems, comfort with ambiguity, ability to context-switch—can become liabilities when the firehose never stops. Novelty-seeking becomes scattered learning. Hyperfocus becomes tunnel vision. Curiosity becomes analysis paralysis.
This 20-minute session shares practical, evidence-based strategies for thriving in high-velocity technical environments when your brain is wired differently.
You will learn:
- Recognizing the difference between "falling behind" (which is inevitable) and "becoming ineffective" (which is addressable)—and building a sustainable learning strategy
- Leveraging ADHD strengths (urgency response, lateral thinking, hyperfocus) through deliberate structure: public/client commitments, teaching, and deadline-driven projects
- Protecting deep work time when your brain craves constant novelty: environment design, dopamine management, and strategic use of time pressure
- Communicating your working style to managers and teams without oversharing or apologizing for neurodivergence
- Building external accountability systems (communities, user groups, blogging) that work with ADHD rather than against it
From Excel to Fabric with AI Agents: Automating Migration with MCP Servers
For decades, spreadsheets have powered the business world. From financial models to risk analysis, Excel's flexibility has made it the go-to tool for countless critical processes. But this has created a problem: valuable data and business logic locked away in files, maintained through manual, siloed workflows that are difficult to scale, govern, or share.
Microsoft Fabric promises to change this—bringing your data into a unified platform where it can be structured, secured, and reused across any tool. But there's a challenge: migrating complex Excel models to Power BI and Fabric is time-consuming, technically demanding, and risky. How do you preserve intricate business logic? How do you maintain control whilst automating the heavy lifting?
Enter the Model Context Protocol (MCP) - the breakthrough that connects AI agents to Power BI and Fabric. These open-standard servers enable AI assistants like Claude to automate your migration: reading Excel structures, generating Power Query transformations, designing and building lakehouses, notebooks, semantic models, DAX measures and visualisations, and validating the entire workflow through natural language conversation.
The AI does the grunt work while you maintain oversight at every critical decision point.
In this session, we'll take a credit risk financial model and migrate it to Fabric using AI agents powered by Power BI and Fabric MCP servers.
You'll see:
• How to configure MCP servers with Claude Desktop or VS Code for seamless Power BI/Fabric integration
• Using "context engineering" to guide AI agents through your specific business requirements and data structures
• Live automation of the migration workflow—from Excel analysis through Power Query generation, lakehouse/notebook and semantic model design/build (including DAX measure creation)
• Implementing validation checkpoints and rollback strategies to maintain quality and control
• Practical patterns for accelerating future migrations while preserving business logic
Attendees will leave with configuration templates, starter prompts, and a framework for leveraging AI to transform weeks-long migration projects into hours—without sacrificing quality, control, or the deep business knowledge embedded in Excel models.
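For context, wiring an MCP server into Claude Desktop is typically a small JSON configuration of the shape below. The server name, package and environment variable here are placeholders; check the specific Power BI/Fabric MCP server's documentation for its actual install command and settings:

```json
{
  "mcpServers": {
    "fabric": {
      "command": "npx",
      "args": ["-y", "<fabric-mcp-server-package>"],
      "env": {
        "FABRIC_WORKSPACE_ID": "<your-workspace-id>"
      }
    }
  }
}
```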
Power BI and Fabric Governance – Going from the skies to the ground!
Imagine yourself in the middle of trying to educate and convince senior stakeholders in a large organisation (e.g. Head of IT Security, Head of IT, CDO, COO) about how Microsoft Fabric can be securely implemented and governed. There are debates around how to balance the requirement for self-service with implementing sufficient technical controls to restrict access to potentially sensitive data. All in an organisation where Power BI has brought a lot of value to the business but a lot of pain for IT, essentially forming part of a growing, ungoverned domain of "Shadow IT". So these stakeholders aren't immediately sold on the idea of bringing more analytics items into the mix with Microsoft Fabric.
In this session you will be immersed into a series of role-play scenarios with these stakeholders, delivered through an interactive e-learning course. This is complete with a brainstorming exercise and interactive Q&A at the end of each module.
Through these scenarios you will learn clear patterns to deploy in order to “land the plane” and go from high level security requirements to ground operations.
This includes practical steps on how to implement technical controls, but also an understanding of how to organise the people on the ground – the airport staff and crew – so that there are clear roles and responsibilities, processes and ways of working.
The content is delivered through 6 modules:
Module 1: Overview. Understand the high-level scope of what is required to effectively govern and manage a Fabric/Power BI estate.
Module 2: Microsoft Fabric as part of a Data Mesh Architecture. Understand how the principles of Data Mesh apply to Microsoft Fabric and how it solves many of the challenges around data silos and dependencies on specialised technical teams.
Module 3: Administering and monitoring the Fabric environment:
- Manage Fabric capacities from a performance and consumption perspective
- Design feature controls using the Fabric Admin portal
- Analyse Fabric adoption and usage
Module 4: Data Governance
Understand the principles and key activities required to set up:
- Integration between MS Fabric and Purview for data lineage/impact analysis, data cataloguing, and automatic scanning/identification of sensitive data
- Information Protection, including Data Loss Prevention, Sensitivity Labels and data encryption
- Data Safeguarding, including data residency, data recovery and end-to-end auditing as required to meet compliance requirements
Module 5: Access control and Data Security
Define and implement security controls around access permissions, including:
- Data-level security such as Row/Object-level security and OneSecurity at the lakehouse/file level
- Workspace and artefact security, including workspace roles, sharing permissions and delegated data permissions
- Identity and network security, including gateways, private endpoints and Microsoft Entra Private Access
Module 6: The Organisational Operating Model.
Bringing it all together and applying the principles of the Fabric Adoption Roadmap to your organisation for end user and developer enablement.
This includes ensuring the right data culture and setting up an Analytics/BI centre of excellence so that all users and admins have the appropriate level of support and the right environment for learning and continuous improvement!
Navigating the journey from Excel to Fabric (and back again!)
For decades, the entire business world has been built on the flexibility and power of spreadsheets. From rocket science to credit derivatives, a mind that could imagine it could implement it with nothing more than Excel formulas and cell references. But over time this has resulted in siloed, manual processes in "Shadow IT" where some of the most valuable data and logic in an organisation is locked away in these files.
But it doesn't need to be like that! The promise of Microsoft Fabric is to bring all your organisational data into a single place, in a single format where it can be extracted, shaped, enriched and secured once for use in any tool. So, from an organizational perspective, there are many benefits of such a unified data platform.
However, it’s not obvious how you would take an Excel-based process (or financial model) and bring it into a Power BI and Fabric environment. Perhaps it’s overkill to even consider such a move, and it’s potentially a significant business risk if those analysts used to working in Excel haven’t yet developed the different skillsets required for Fabric.
In this session we will take a financial credit risk model in Excel and go through the thought process and workflow for bringing this into Power BI/Fabric including:
- Considering how and why to use Power Query to combine and structure the data inputs into a star schema (which can be done in Excel!)
- Implementing the business logic with DAX, field parameters for increased flexibility and Translytical writeback to model scenarios directly from Power BI.
- Using visuals such as maps and dynamic scatter plots to tell a story with the data
- Breaking down the stages of data transformation/shaping into a medallion lakehouse structure in Fabric, with Row/Column-level security at each layer
- Bringing the data at various layers back into Excel with this security model applied
- Querying the data model with Copilot and Data Agents to get AI powered insights
Attendees will come away with a view of how to drive “on the ground” business process transformation using Microsoft Fabric and ideas on how analysts used to Excel can think differently about the structure and re-usability of data, in order to take advantage of an enterprise level BI/Data platform.
From Strategy to Execution: Building AI-Ready Data Products with Microsoft Fabric
Most organizations struggle to move beyond data strategy slide decks. They invest in platforms, hire skilled teams, and establish governance frameworks—yet research shows 80% of these initiatives fail to deliver measurable business value. The problem isn't technology—it's the gap between strategic intent and operational reality.
This session tackles that gap head-on by demonstrating how to implement a modern data strategy using Microsoft Fabric, transforming abstract concepts into production-ready data products that business users actually adopt.
The Problem: Organizations face three critical challenges: (1) translating data strategy documents into actionable operating models, (2) balancing central governance with domain autonomy without creating bottlenecks, and (3) preparing data infrastructure that makes AI capabilities genuinely useful rather than just impressive demos.
The Solution: Microsoft Fabric's unified platform—combined with a structured approach to data products, domains, and personas—provides the technical foundation.
But success requires more: clear decision rights, reusable DevOps patterns, automated quality gates, and a Centre of Excellence that enables rather than restricts.
What You'll Learn: In 50 minutes, we'll walk through a practical framework covering:
• How to structure workspaces, domains, and capacities to operationalize "discipline at the core, flexibility at the edge"
• The six components that convert strategy into execution: leadership sponsorship, persona-based skills development, operating model design, medallion architecture patterns, governance automation, and value measurement
• Live demonstration of transforming a typical Excel-based financial process into a governed, AI-ready data product using Power Query, lakehouse layers, semantic models, and natural language queries via Copilot
• Integration with MCP (Model Context Protocol) servers to extend Power BI and Fabric capabilities into AI agent workflows
• Real-world patterns for balancing speed and control: data contracts, Great Expectations quality tests, and Git-based promotion pipelines
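As a flavour of what an automated quality gate checks, here is a plain-Python sketch. Great Expectations and Fabric provide this as managed tooling; the rules, column names and account codes below are invented for the illustration:

```python
# Illustrative quality gate for a silver-layer table: every row must
# pass schema, business-rule and referential-integrity checks before
# promotion to gold. In practice a framework such as Great Expectations
# runs checks like these automatically.
KNOWN_ACCOUNTS = {"4000", "5000"}  # stand-in for an account dimension

def validate_row(row: dict) -> list[str]:
    errors = []
    # Schema check: required columns present.
    for col in ("account", "amount", "currency"):
        if col not in row:
            errors.append(f"missing column: {col}")
    # Business rule: currency must be a 3-letter code.
    if len(str(row.get("currency", ""))) != 3:
        errors.append("invalid currency code")
    # Referential integrity: account must exist in the dimension.
    if row.get("account") not in KNOWN_ACCOUNTS:
        errors.append("unknown account")
    return errors

good = {"account": "4000", "amount": 10.0, "currency": "USD"}
bad = {"account": "9999", "amount": 10.0, "currency": "US"}
print(validate_row(good))  # [] - row may be promoted
print(validate_row(bad))   # two violations block promotion
```

The same rule definitions double as the data contract: the schema, business rules and integrity constraints a data product promises its consumers.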
Designing and Implementing a Scalable Data Strategy with Microsoft Fabric
Organising, managing, securing and optimising data within an organization requires more than technology - it requires a strategy in which people, processes and technology all work in harmony: removing silos, implementing governance and security mechanisms, enabling self-service, and delivering timely, trusted insights to decision-makers.
As organizations continue to expand their data footprint across cloud applications, IoT systems, and operational databases, the need for a cohesive and scalable data strategy has never been more critical and never has it been more challenging to know even where to start.
This series of workshops presents a practical approach to designing and implementing a modern data strategy using Microsoft Fabric.
Anchored in real-world challenges derived from client case studies at Avanade (the largest Microsoft-based consultancy globally), we explore how to move from fragmented, manual processes to an integrated, governed, and business-aligned data platform.
We’ll cover key architectural and operating model considerations—drawing from data mesh principles—to ensure that data is accessible, secure, and tailored to the needs of business users.
Key takeaways include:
- How to align data strategy with organizational goals using domain-driven design
- Establishing data governance and security using Microsoft Purview
- Integrating diverse data sources into a unified architecture
- Empowering business teams through curated, reusable data products and Power BI self-service
- Practical considerations for reducing technical debt and accelerating time-to-insight
Attendees will gain insights into how Microsoft Fabric components such as OneLake, Data Factory, Purview, and Power BI can be orchestrated to deliver enterprise-ready data products, enable self-service analytics, and support real-time decision-making at scale.
Whether you are a technical practitioner, architect, or business leader, this session will offer actionable guidance to help you unlock the full potential of your data estate using Microsoft Fabric.
Data Storytelling in Power BI: From Reports to Narratives
BI developers often hesitate to use data visualization as storytelling. Common questions arise: "How can I tell a story for another part of the business when I don't have domain expertise?", "How do I create narratives when the data changes daily?", "Isn't my job just to present facts, not opinions?"
Too often, these questions become excuses for not building the best visualizations we're capable of. Data storytelling isn't about making judgements - it's about understanding your user personas, making them the central characters, and presenting data as arguments (not just statements) that resolve the tension (around business performance!) that every good story needs.
This session demonstrates how to transform standard Power BI reports into compelling data narratives that drive action.
In live demos, we'll deconstruct real-world Power BI reports built for the community, showing how storytelling techniques transform them from data dumps into narratives. You'll see practical application of pre-attentive attributes (colour, size, position), argumentative data presentation structures, and "so what" conclusions that drive user action.
You will learn:
- The core structure of compelling stories and how this translates directly to Power BI report design and information architecture
- Using pre-attentive attributes strategically to guide viewers through your narrative without overwhelming them with visual noise
- Framing data as arguments rather than statements - presenting insights that lead viewers to conclusions instead of leaving interpretation ambiguous
- Building report structures around user personas to make them the "central characters" and their business challenges the "tension" to be resolved
- Creating "so what" conclusions in reports that provide clear calls-to-action based on the data story presented
Perfect for Power BI developers, report designers, and BI professionals who want their reports to drive action rather than just present information.
Building a Financial Income Statement Report in Power BI
In this session Rishi will show how to build an Income Statement in Power BI end-to-end, looking at:
- Capturing requirements for the report and scoping/designing a data model with scattered/inconsistent transaction data
- Use of a "FS Lines" table to have metadata driven financial statements including custom calculations
- Creating Dataflows to ingest data from an on-premises database/files/Web API (for exchange rates)
- Shaping the data in Power BI Desktop into a General Ledger/Trial Balance with corresponding dimensions and currency conversion
- Creating base measures for Revenue/Profit and corresponding ratios including time-based variations (MTD/QTD/YTD) and Current vs Prior Year calculations
- Defining DAX measures for Running Totals, Category Totals and Custom Calculations
- Applying different number formats for viewing the income statement in reporting currency (USD) vs local currencies (GBP/AUD/EUR)
- Applying conditional formatting for the Income Statement matrix to highlight sub-totals/custom calculation lines differently.
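The running-total logic behind the income statement matrix can be sketched outside DAX as follows. This is a plain-Python illustration with made-up FS line data; in the session the same calculation is implemented as DAX measures:

```python
# Illustration of income-statement running totals: each "FS line"
# carries a sort order, and the running total sums every line at or
# above it (so line 3 yields operating profit). Data is invented.
fs_lines = [
    {"order": 1, "line": "Revenue", "amount": 1000.0},
    {"order": 2, "line": "Cost of Sales", "amount": -600.0},
    {"order": 3, "line": "Operating Expenses", "amount": -250.0},
]

def running_total(order: int) -> float:
    # Rough DAX analogue: CALCULATE(SUM(amount),
    #   FILTER(ALL(FSLines), FSLines[order] <= current order))
    return sum(l["amount"] for l in fs_lines if l["order"] <= order)

for l in fs_lines:
    print(l["line"], running_total(l["order"]))
# Running totals: 1000.0, 400.0 (gross profit), 150.0 (operating profit)
```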
Building a credit rating model in Power BI
This session will explore how to transform a traditional Excel-based credit rating model for countries into a modern, interactive, and scalable solution using Power BI and Microsoft Fabric. Participants will discover innovative techniques to enhance financial modeling and storytelling by leveraging advanced analytics and integration capabilities.
Hear from Power Platform in Finance specialist and Microsoft MVP, Rishi Sapra, as we delve into the following areas:
• Replicating Complex Rating Logic: Translating intricate credit rating formulas and methodologies from Excel into DAX to ensure consistency and accuracy.
• Interactive Insights: Using slicers, bookmarks, and contextual visualizations in Power BI to enable dynamic comparisons and highlight key drivers of credit ratings.
• Streamlined Reporting: Designing intuitive and user-friendly reports that empower stakeholders to explore insights interactively.
• Scalability with Microsoft Fabric: Harnessing the Medallion Lakehouse architecture to enable seamless data integration, metadata management, and collaboration at an enterprise level.
This session is particularly beneficial for finance professionals and analysts who are seeking to modernize their financial or operational models. Attendees will leave with practical strategies to maintain model accuracy, enhance interactivity, and scale their solutions to meet the evolving demands of modern analytics.
AI-Enabled Financial Analytics with Microsoft Fabric
Finance data scattered across ERPs and Excel makes reporting slow and blocks AI-driven insights. Build a complete financial analytics solution using Copilot throughout: data exploration, semantic modeling, transformation with Notebooks/Dataflows, DAX measures, Smart Narratives, and Fabric Data Agents.
Work with realistic P&L data including multi-currency and eliminations. Leave with a working Income Statement solution and reusable templates.
From Strategy to Execution: Building AI-Ready Data Products with Microsoft Fabric (TRAINING DAY)
Most organizations struggle to move beyond data strategy slide decks. They invest in platforms, hire skilled teams, and establish governance frameworks—yet research shows 80% of these initiatives fail to deliver measurable business value. The problem isn't technology—it's the gap between strategic intent and operational reality.
This hands-on training day tackles that gap by demonstrating how to implement a modern data strategy using Microsoft Fabric, transforming abstract concepts like "data products," "domains," and "data mesh" into production-ready solutions that business users actually adopt.
Organizations face three critical challenges: translating data strategy documents into actionable operating models, balancing central governance with domain autonomy without creating bottlenecks, and preparing data infrastructure that makes AI capabilities genuinely useful rather than just impressive demos. Microsoft Fabric's unified platform—combined with a structured approach to data products, domains, and personas—provides the technical foundation. But success requires more: clear decision rights, reusable DevOps patterns, automated quality gates, and a Centre of Excellence that enables rather than restricts.
This training day walks through the complete implementation framework: structuring workspaces, domains, and capacities to operationalize "discipline at the core, flexibility at the edge"; defining personas, decision rights, and operating models; building medallion architecture patterns with automated quality validation; enriching data models with metadata for AI readiness; and establishing DevOps workflows for controlled promotion and deployment.
Full Day Agenda
** Morning Session (09:00 - 12:30) **
- Module 1: Data Strategy Foundations and Fabric Architecture (09:00 - 10:30)
- Why data strategies fail: Common pitfalls and the strategy-to-execution gap
- The six components of effective data strategy: Leadership, people, process, technology, governance, measurement
- Fabric architecture overview: Workspaces, domains, capacities, and OneLake
- Operating models: Centralized, decentralized, and blended approaches
- Hands-on Lab: Fabric environment setup: Provisioning Fabric capacity and understanding licensing models; creating workspace structure aligned to organizational domains; configuring domain hierarchies and ownership models; setting up role-based access control (RBAC) for different personas
- Introduction to personas: Data consumers, explorers, analysts, analytics engineers, data engineers, administrators
- Coffee break (10:30 - 10:45)
- Module 2: Medallion Architecture and Data Products (10:45 - 12:30)
- From data assets to data products: What makes a data product
- Medallion architecture: Bronze, silver, gold, and platinum layers
- Hands-on Lab: Building the bronze layer with raw data ingestion: Creating a lakehouse for raw data landing; implementing data pipelines for incremental extraction; handling schema drift and source system changes; establishing data lineage with Purview integration
- Hands-on Lab: Silver layer with business rules and quality gates: Implementing Great Expectations for data quality validation; creating automated quality tests (schema validation, business rule checks, referential integrity); building Data Activator alerts for quality threshold breaches; documenting data contracts (schema definitions, SLAs, quality rules); understanding data product ownership and accountability
- Lunch (12:30 - 13:30)
** Afternoon Session (13:30 - 17:00) **
- Module 3: Gold Layer and Semantic Models for AI (13:30 - 15:00)
- Gold layer design: Consumption-optimized data products
- Hands-on Lab: Creating the gold layer with dimensional modeling: Designing star schemas for analytics consumption; implementing incremental processing and performance optimization; applying row-level and column-level security; creating shortcuts for cross-domain data sharing without duplication
- Hands-on Lab: Building AI-ready semantic models: Creating semantic models on gold layer data; enriching with metadata (descriptions, synonyms, display folders, data categories); configuring AI instructions/metadata for Copilot; building calculation groups for time intelligence; testing Copilot integration for natural language insights
- Hands-on Lab: MCP server integration: Configuring MCP servers for programmatic AI access; building custom AI agents that query governed data products; validating AI interpretation of business metrics
- Coffee break (15:00 - 15:15)
- Module 4: DevOps, Governance, and Centre of Excellence (15:15 - 16:30)
- DevOps for Fabric: Version control, CI/CD, and deployment automation
- Hands-on Lab: Git-based deployment pipelines: Configuring Git integration for Fabric workspaces; creating Azure DevOps pipelines for automated deployment; implementing deployment stages (development, test, production); building automated testing for data quality and semantic model validation; implementing rollback strategies for failed deployments
- Hands-on Lab: Governance automation: Configuring Microsoft Purview for data cataloging and lineage; applying sensitivity labels and information protection policies; implementing data loss prevention (DLP) rules; creating compliance dashboards and audit reports
- Centre of Excellence patterns: Capability management, solution design, shared assets, community enablement
- Hands-on Lab: Creating reusable templates and patterns: Building workspace templates (defined in markdown/JSON) with pre-configured elements required for security/governance; creating pipeline templates for common ingestion patterns; establishing documentation standards and wiki templates
- Module 5: Measurement, Value Demonstration, and Continuous Improvement (16:30 - 17:30)
- Measuring success: Adoption metrics, data quality trends, business impact
- Live Demo: Building executive dashboards for data strategy ROI
- Tracking workspace usage and adoption by persona
- Monitoring data quality trends and cost of quality incidents
- Measuring time-to-delivery for new data products
- Calculating ROI: time saved, decisions accelerated, manual processes eliminated
- Operating model maturity assessment framework
- Common pitfalls and lessons learned from real-world implementations
- Scaling patterns: From pilot to enterprise rollout
From Manual Month-End to AI-Powered Finance in Fabric
Finance teams work with the most sensitive data in any organization, yet they're often stuck in manual, spreadsheet-driven processes.
The challenge isn't just technology adoption—it's about navigating steep learning curves, maintaining governance and control, overcoming IT dependencies, and translating complex finance logic that lives in people's heads or Excel files into scalable, secure solutions.
Without a clear strategy, attempting to modernize these manual processes risks creating more technical debt rather than solving the underlying problems.
Microsoft Fabric and Copilot offer a path forward—low/no-code tools that finance teams can own and operate, with AI assistance to accelerate development while maintaining full control.
Recent innovations like Model Context Protocol (MCP) servers enable AI agents to help not just design but to actually build and maintain solutions through natural language, dramatically reducing implementation time while preserving governance and quality standards.
This hands-on training day walks through building a complete month-end reporting solution from scratch. You'll learn how to securely connect to ERP systems and spreadsheets, use AI-assisted notebooks and Copilot to consolidate and transform transactional data, implement medallion architecture (bronze/silver/gold layers) with appropriate security at each stage, enrich data models with the metadata and business context that makes AI effective, and integrate Copilot and Data Agents for natural language financial insights and commentary.
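The bronze/silver/gold flow the day walks through can be pictured with a toy example. This is illustrative only; field names and values are invented, and in the training the layers are built as lakehouses and notebooks in Fabric:

```python
# Toy medallion flow: bronze keeps raw records exactly as landed,
# silver cleans and standardises them, gold aggregates for reporting.
bronze = [
    {"entity": "UK", "account": "Revenue ", "amount": "100.0"},
    {"entity": "UK", "account": "Revenue",  "amount": "50.0"},
    {"entity": "DE", "account": "revenue",  "amount": "75.0"},
]

# Silver: type the amounts, trim and standardise account names.
silver = [
    {"entity": r["entity"],
     "account": r["account"].strip().title(),
     "amount": float(r["amount"])}
    for r in bronze
]

# Gold: consolidated, consumption-ready aggregate per account.
gold: dict[str, float] = {}
for r in silver:
    gold[r["account"]] = gold.get(r["account"], 0.0) + r["amount"]

print(gold)  # {'Revenue': 225.0}
```

Keeping the raw bronze copy untouched is what makes the later layers auditable: any gold number can be traced back to the source records that produced it.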
Full Day Agenda
** Morning Session (09:00 - 12:30) **
- Module 1: Environment Setup and Data Connectivity (09:00 - 10:30)
- Understanding the finance reporting challenge: Why manual processes persist and what's required to replace them
- Fabric workspace provisioning and capacity allocation for financial workloads
- Hands-on Lab: Connecting to ERP systems with proper credential management; setting up secure connections to sample financial data sources; configuring authentication and managing connection strings; initial data profiling to understand source data structures
- Coffee break (10:30 - 10:45)
- Hands-on Lab: Establishing data governance from day one: Applying sensitivity labels to workspaces and datasets; configuring Purview policies for financial data; setting up audit logging for compliance requirements
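The "initial data profiling" step in Module 1 can be as simple as counting nulls and distinct values per column before any transformation is designed. A minimal plain-Python sketch (the sample rows are invented stand-ins for an ERP export; in the lab this would run against the connected source):

```python
# Minimal data-profiling sketch: per-column null and distinct-value counts.
# Sample rows are invented stand-ins for ERP export records.

def profile(rows: list[dict]) -> dict:
    """Return {column: {"nulls": n, "distinct": m}} for a list of records."""
    report = {}
    for col in rows[0].keys():
        values = [r[col] for r in rows]
        report[col] = {
            "nulls": sum(v is None for v in values),
            "distinct": len(set(v for v in values if v is not None)),
        }
    return report

sample = [
    {"account": "4000", "entity": "UK01", "amount": 1200.0},
    {"account": "4000", "entity": "DE01", "amount": None},
    {"account": "5100", "entity": "UK01", "amount": -350.0},
]
report = profile(sample)
```

Even this crude pass surfaces the questions that drive the rest of the day: which columns are keys, where nulls cluster, and how wide the dimensional attributes really are.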
- Module 2: Building the Silver Layer with AI Assistance (10:45 - 12:30)
- Medallion architecture overview: Bronze, silver, and gold layers for financial data
- Introduction to Fabric notebooks and AI-assisted development
- Hands-on Lab: Creating bronze lakehouse with raw data ingestion; Using Data Factory pipelines for incremental extraction; Handling schema drift and source system changes; Implementing basic error handling and retry logic
- Hands-on Lab: MCP server setup and configuration: Installing Claude Desktop and configuring Fabric MCP server; Authentication and workspace connection; Testing basic operations through natural language commands
- Introduction to using AI agents for notebook development
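The incremental-extraction step in the bronze lab follows a standard high-watermark pattern: keep the latest modified timestamp seen, and pull only rows newer than it on the next run. A self-contained sketch with simulated source rows (in Fabric this logic would live in a pipeline or notebook, with the watermark persisted to a table):

```python
# High-watermark incremental extraction sketch.
# Source rows and the stored watermark are simulated here.

from datetime import datetime

def extract_incremental(source_rows, last_watermark):
    """Return rows modified after the watermark, plus the advanced watermark."""
    new_rows = [r for r in source_rows if r["modified"] > last_watermark]
    new_watermark = max((r["modified"] for r in new_rows), default=last_watermark)
    return new_rows, new_watermark

rows = [
    {"id": 1, "modified": datetime(2024, 1, 1)},
    {"id": 2, "modified": datetime(2024, 1, 5)},
    {"id": 3, "modified": datetime(2024, 1, 9)},
]
batch, wm = extract_incremental(rows, datetime(2024, 1, 3))
# batch contains ids 2 and 3; the watermark advances to 2024-01-09
```

Schema drift and retry logic (also covered in this module) wrap around this core: the extraction is retried on transient failures, and incoming columns are reconciled against the bronze schema before the watermark is committed.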
- Lunch (12:30 - 13:30)
** Afternoon Session (13:30 - 17:00) **
- Module 3: Data Transformation and Silver/Gold Layers (13:30 - 15:00)
- Understanding financial data transformation requirements
- Hands-on Lab: Building the silver layer with business rules
- Using AI-assisted notebooks to consolidate multi-entity transactional data
- Implementing data quality checks with a framework (e.g. Great Expectations)
- Applying business logic transformations for financial calculations
- Hands-on Lab: Creating the gold layer with dimensional modeling; Designing star schemas for financial reporting; Building dimensions (time, account, cost center, entity); Creating fact tables with appropriate grain; Implementing row-level security for multi-entity access control
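The data-quality step in this module uses a framework such as Great Expectations, but the underlying pattern is simple: declare named rules, evaluate them against the rows, and collect failures. A hand-rolled illustration of that pattern (this is not the Great Expectations API; the journal rows are invented):

```python
# Hand-rolled illustration of the expectation pattern behind frameworks
# like Great Expectations: declare rules, evaluate, collect failures.
# Not the Great Expectations API itself; sample data is invented.

def expect(rows, name, predicate):
    """Evaluate one rule; report pass/fail and how many rows failed."""
    failures = [r for r in rows if not predicate(r)]
    return {"rule": name, "passed": not failures, "failed_rows": len(failures)}

journal = [
    {"account": "4000", "amount": 100.0, "currency": "GBP"},
    {"account": None,   "amount": 50.0,  "currency": "GBP"},
    {"account": "5100", "amount": 0.0,   "currency": "EUR"},
]

results = [
    expect(journal, "account_not_null", lambda r: r["account"] is not None),
    expect(journal, "amount_non_zero",  lambda r: r["amount"] != 0),
]
```

A real framework adds what this sketch omits: persisted expectation suites, result stores, and integration with pipeline orchestration so failed checks can halt promotion from silver to gold.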
- Coffee break (15:00 - 15:15)
- Module 4: Semantic Models and AI Integration (15:15 - 16:30)
- Designing semantic models for both traditional BI and AI consumption
- Hands-on Lab: Building and enriching semantic models: Creating DAX measures for key financial metrics (gross margin, variance analysis); Using MCP servers to accelerate measure development through natural language; Enriching models with metadata: descriptions, synonyms, display folders; Configuring AI instructions and other metadata for financial terminology; Building calculation groups for time intelligence (YTD, QTD, Prior Year)
- Hands-on Lab: Copilot and Data Agent integration: Testing Copilot for natural language queries on financial data; Building Data Agent skills for variance commentary and insights; Creating conversational interfaces for board pack preparation; Validating AI-generated insights against known calculations
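The DAX measures built in this module (gross margin, variance analysis) reduce to simple ratios, which is exactly what makes the final lab step possible: validating AI-generated commentary against known calculations. A plain-Python restatement of the arithmetic, with invented example figures:

```python
# Plain-Python restatement of the financial measures built in DAX,
# usable for validating AI-generated figures. Example values are invented.

def gross_margin_pct(revenue: float, cogs: float) -> float:
    """Gross margin as a fraction of revenue."""
    return (revenue - cogs) / revenue

def variance_pct(actual: float, budget: float) -> float:
    """Variance vs budget as a fraction of budget; positive = favourable for revenue."""
    return (actual - budget) / budget

gm = gross_margin_pct(500_000.0, 320_000.0)   # 0.36
var = variance_pct(500_000.0, 450_000.0)      # ~0.111 above budget
```

Keeping an independent, deterministic calculation like this alongside the semantic model gives a cheap cross-check when Copilot or a Data Agent narrates the numbers.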
- Module 5: Production Deployment and Best Practices (16:30 - 17:30)
- Security considerations: Row-level security, column masking, audit trails
- Performance optimization: Incremental refresh strategies, aggregations, query optimization
- Governance and compliance: Purview integration, lineage tracking, sensitivity labeling
- Deployment roadmap: Moving from development to production
- Monitoring and maintenance: Capacity metrics, pipeline health, data quality alerts
- Common pitfalls and troubleshooting strategies
- Resources and community support for continued learning
- Open Q&A
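The row-level security covered in Module 5 is, conceptually, a per-user filter applied before data reaches the report. In Fabric/Power BI this is declared on the semantic model via roles and DAX filter expressions; the sketch below just illustrates the idea, with an invented user-to-entity mapping:

```python
# Conceptual sketch of row-level security: a per-user entity filter.
# In Fabric/Power BI this is declared on the semantic model via roles;
# the user-to-entity mapping here is invented for illustration.

USER_ENTITIES = {
    "alice@contoso.com": {"UK01"},
    "bob@contoso.com":   {"UK01", "DE01"},
}

def apply_rls(rows, user):
    """Return only the fact rows whose entity the user is entitled to see."""
    allowed = USER_ENTITIES.get(user, set())
    return [r for r in rows if r["entity"] in allowed]

facts = [
    {"entity": "UK01", "amount": 100.0},
    {"entity": "DE01", "amount": 200.0},
]
```

The key production point is that the filter lives in the model, not the report: every query path (Power BI visuals, Copilot, Data Agents) inherits the same entitlement.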
What attendees will take home:
- Working Fabric workspace with complete month-end reporting solution
- MCP server configuration templates and starter prompts for financial scenarios
- Semantic model templates with financial calculation patterns
- Power BI report templates for month-end board packs
- Deployment checklists and governance frameworks
- GitHub repository with all lab materials, sample data, and documentation
By the end of this training day, you'll have practical experience building AI-powered financial reporting in Fabric, with a clear implementation path that can be applied to your organization's specific requirements without requiring specialized data engineering skills.
Target audience: Data engineers, Power BI developers, database administrators, and BI professionals supporting finance departments or building analytical solutions for sensitive data environments.
Transform your finance function with Power BI, Fabric, ChatGPT, Copilot and AI Skills!
There is no doubt that the Microsoft data analytics and Gen AI platforms can help you re-imagine processes and re-invent an entire business function like Finance. But how? Whilst an interactive Power BI report (replacing the manual month end board pack for example) is an essential part of a solution, it is not likely to be enough by itself, especially in the age of stakeholders expecting Gen AI!
Instead you need to go through a methodical approach of understanding your data, capturing stakeholder requirements (functional and non functional!), using these to design a star schema semantic model, developing data engineering processes to shape and clean data, enriching your model with measures and metadata and then designing prompt engineering to train Copilot/a custom RAG model on your data!
That's a lot of work! Fortunately, Copilot and ChatGPT can help with every single step once you're working in the AI Powered platform of Microsoft Fabric!
In this session Rishi will show an example of Income Statement reporting being taken through this process, all with minimal code and just enough effort to make this an example use case that can be completed in days and weeks rather than months and years.
European Microsoft Fabric Community Conference 2025
Power BI & Fabric Summit 2025
Month of Copilots
SQLBits 2024 - General Sessions
#DataWeekender v4.2
South Coast Summit 2021