Rishi Sapra
Data Platform MVP | Data & Analytics Consultant, Speaker, Trainer and Technology evangelist specialising in Data Visualisation (Power BI) and Microsoft Fabric
Rishi is a Chartered Accountant and Microsoft MVP (technology influencer/speaker) with an Executive MBA (Hons) and a first-class degree from the London School of Economics.
He currently leads Data & AI strategic projects at Avanade (Accenture-Microsoft JV), focusing on Microsoft Fabric solutions, governance frameworks, and enterprise migrations.
With 20+ years' experience across Big 4 firms, technology consulting and leading financial institutions (Deloitte, KPMG, HSBC, Barclays, Accenture), he combines deep financial expertise with cutting-edge data and AI capabilities to drive business transformation.
He runs an accelerator program - Power Platform Finance (www.powerplatformfinance.com) - for finance professionals to learn Power BI through a combination of interactive e-learning and instructor-led training, delivered in a way that allows learners to apply it effectively to their own data and processes.
From SAP Extracts to AI-Powered Insights (TRAINING DAY)
Finance teams face a persistent challenge: critical data remains locked in SAP while traditional approaches require weeks of IT involvement to extract data, build pipelines, and develop reports. This results in stale information that hampers decision-making and prevents deployment of AI-powered analytics.
This session demonstrates how Microsoft's Business Process Solutions transforms this paradigm by automatically deploying complete data infrastructure for modern financial reporting.
Attendees will experience Business Process Solutions connecting SAP to Microsoft Fabric, automatically creating lakehouses, data pipelines, and production-ready financial reports without manual development.
The session then explores how AI assistance through the Power BI MCP Server enables finance professionals to customize pre-built solutions to match organizational requirements. We'll demonstrate adapting hierarchical structures to tabular formats, incorporating reporting standards like IAS 1, and building custom calculations—compressing weeks of specialized development into hours.
Finally, we'll demonstrate deploying Fabric Data Agents into Microsoft Teams for conversational access to live SAP data, and integrating Finance Agents for Excel and custom agents built in Copilot Studio.
Participants will leave with a practical roadmap for transitioning from static, extract-based reporting to live, AI-augmented financial intelligence.
Full Day Agenda
Morning Session (09:00 - 12:30)
Module 1: Introduction and Environment Setup (09:00 - 10:30)
- The SAP data challenge: Why finance teams struggle with locked data and week-long IT dependencies
- Overview of Microsoft Business Process Solutions for SAP
- Hands-on Lab: Setting up an SAP S/4HANA trial instance using the SAP Cloud Appliance Library (cal.sap.com); creating your SAP account; understanding key financial tables
Coffee break (10:30 - 10:45)
Module 2: Deploying Business Process Solutions (10:45 - 12:30)
- Architecture overview: How Business Process Solutions automates infrastructure deployment
- Live Demo: Connecting Business Process Solutions to your SAP instance
- Hands-on Lab: Automated deployment of lakehouses (bronze, silver and gold layers), data pipelines for SAP extraction, pre-built financial semantic models, and production-ready Power BI reports (e.g. Financial Reporting, Order to Cash); understanding what gets created automatically vs. what requires customization (see the medallion-layer sketch below)
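For orientation, here is a minimal sketch of the medallion pattern these lakehouses follow, written as it might appear in a Fabric PySpark notebook. The file path, table names and columns are illustrative assumptions, not what Business Process Solutions actually generates.

```python
# Minimal medallion-layer sketch for a Fabric PySpark notebook.
# Assumes the notebook runtime provides a Spark session ("spark") and that a
# raw SAP extract has already landed in the lakehouse Files area.
from pyspark.sql import functions as F

# Bronze: land the raw extract as-is (illustrative path and table name)
bronze_df = spark.read.option("header", "true").csv("Files/raw/gl_entries.csv")
bronze_df.write.format("delta").mode("overwrite").saveAsTable("bronze_gl_entries")

# Silver: apply typing, de-duplication and basic business rules
silver_df = (
    spark.table("bronze_gl_entries")
    .withColumn("PostingDate", F.to_date("PostingDate"))
    .withColumn("Amount", F.col("Amount").cast("decimal(18,2)"))
    .dropDuplicates(["DocumentNumber", "LineItem"])
    .filter(F.col("CompanyCode").isNotNull())
)
silver_df.write.format("delta").mode("overwrite").saveAsTable("silver_gl_entries")

# Gold: aggregate to the grain the financial reports consume
gold_df = (
    spark.table("silver_gl_entries")
    .groupBy("CompanyCode", "GLAccount", F.trunc("PostingDate", "month").alias("Month"))
    .agg(F.sum("Amount").alias("Amount"))
)
gold_df.write.format("delta").mode("overwrite").saveAsTable("gold_gl_monthly")
```

The point is the layering itself: raw landing, then typed and cleansed, then aggregated to reporting grain; the automated deployment produces a richer version of the same structure.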
- Lunch (12:30 - 13:30)
Afternoon Session (13:30 - 17:30)
Module 3: AI-Assisted Customization with MCP Servers (13:30 - 15:00)
- The customization challenge: Adapting pre-built solutions to your organization
- Introduction to Model Context Protocol (MCP) servers for Power BI and Fabric
- Hands-on Lab: Installing and configuring Claude Desktop with Power BI MCP server
- Hands-on Lab: Using AI agents to customize SAP financial reports: converting hierarchical GL structures to a tabular format for your chart of accounts (see the flattening sketch below); incorporating IAS 1 presentation requirements; building custom DAX measures for margin analysis and variance analysis; creating calculation groups for time intelligence; context engineering: guiding AI to understand your specific requirements
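To give a flavour of the kind of transformation the AI agent is asked to produce, here is a minimal pandas sketch that flattens a parent-child GL hierarchy into level columns suitable for a tabular chart of accounts. The column names and sample rows are illustrative assumptions, not the actual SAP layout used in the lab.

```python
import pandas as pd

# Illustrative parent-child chart of accounts (not real SAP data)
coa = pd.DataFrame({
    "AccountId":   ["1000", "1100", "1110", "2000", "2100"],
    "ParentId":    [None,   "1000", "1100", None,   "2000"],
    "AccountName": ["Assets", "Current Assets", "Cash", "Liabilities", "Payables"],
})

def build_path(account_id, lookup):
    """Walk up the parent chain and return account names from root to leaf."""
    path = []
    while account_id is not None and account_id in lookup.index:
        row = lookup.loc[account_id]
        path.append(row["AccountName"])
        account_id = row["ParentId"]
    return list(reversed(path))

lookup = coa.set_index("AccountId")
paths = coa["AccountId"].apply(lambda a: build_path(a, lookup))
max_depth = paths.map(len).max()

# One column per hierarchy level, padding shallow branches with the leaf name
flat = pd.DataFrame(
    [p + [p[-1]] * (max_depth - len(p)) for p in paths],
    columns=[f"Level{i + 1}" for i in range(max_depth)],
)
flat.insert(0, "AccountId", coa["AccountId"].values)
print(flat)
```

In the lab the agent generates the equivalent logic (in Power Query or a notebook) against your own chart of accounts, guided by the context you provide.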
- Coffee break (15:00 - 15:15)
Module 4: Deploying Conversational AI Agents (15:15 - 16:30)
- Making SAP data accessible through natural language
- Hands-on Lab: Deploying Fabric Data Agents into Microsoft Teams: creating a Data Agent skill on top of your SAP financial semantic model and lakehouse tables, with few-shot examples and AI instructions (e.g. on how to use the model vs. the lakehouse); testing conversational queries ("What's our gross margin this quarter?"); handling follow-up questions and context
- Live Demo: Integrating Finance Agents for Excel and Copilot Studio: a look at how the SAP data can be leveraged in Finance Agents for Excel (Reconciliation, Financial Variance Analysis, etc.) as well as a source for Copilot Studio agents; integrating approval workflows
Module 5: Q&A, Best Practices, and Next Steps (16:30 - 17:30)
- Security and governance considerations for SAP data in Fabric
- Performance optimization: Incremental refresh, aggregations
- Roadmap: From proof-of-concept to production deployment
- Common pitfalls and how to avoid them
- Resources: Documentation, community support
- Q&A
From Strategy to Execution: Building AI-Ready Data Products with Microsoft Fabric (TRAINING DAY)
Most organizations struggle to move beyond data strategy slide decks. They invest in platforms, hire skilled teams, and establish governance frameworks—yet research shows 80% of these initiatives fail to deliver measurable business value. The problem isn't technology—it's the gap between strategic intent and operational reality.
This hands-on training day tackles that gap by demonstrating how to implement a modern data strategy using Microsoft Fabric, transforming abstract concepts like "data products," "domains," and "data mesh" into production-ready solutions that business users actually adopt.
Organizations face three critical challenges: translating data strategy documents into actionable operating models, balancing central governance with domain autonomy without creating bottlenecks, and preparing data infrastructure that makes AI capabilities genuinely useful rather than just impressive demos. Microsoft Fabric's unified platform—combined with a structured approach to data products, domains, and personas—provides the technical foundation. But success requires more: clear decision rights, reusable DevOps patterns, automated quality gates, and a Centre of Excellence that enables rather than restricts.
This training day walks through the complete implementation framework: structuring workspaces, domains, and capacities to operationalize "discipline at the core, flexibility at the edge"; defining personas, decision rights, and operating models; building medallion architecture patterns with automated quality validation; enriching data models with metadata for AI readiness; and establishing DevOps workflows for controlled promotion and deployment.
Full Day Agenda
Morning Session (09:00 - 12:30)
Module 1: Data Strategy Foundations and Fabric Architecture (09:00 - 10:30)
- Why data strategies fail: Common pitfalls and the strategy-to-execution gap
- The six components of effective data strategy: Leadership, people, process, technology, governance, measurement
- Fabric architecture overview: Workspaces, domains, capacities, and OneLake
- Operating models: Centralized, decentralized, and blended approaches
- Hands-on Lab: Fabric environment setup: Provisioning Fabric capacity and understanding licensing models; creating a workspace structure aligned to organizational domains; configuring domain hierarchies and ownership models; setting up role-based access control (RBAC) for different personas (see the workspace-creation sketch below)
- Introduction to personas: Data consumers, explorers, analysts, analytics engineers, data engineers, administrators
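To make the provisioning step concrete, here is a minimal sketch that creates domain-aligned workspaces through the Fabric REST API. The create-workspace endpoint is part of the public Fabric API, but the token acquisition (not shown), the optional capacity assignment and the naming convention are assumptions for illustration only.

```python
import requests

# Assumption: ACCESS_TOKEN is a valid Microsoft Entra token with Fabric API scope.
ACCESS_TOKEN = "<your-access-token>"
FABRIC_API = "https://api.fabric.microsoft.com/v1"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}", "Content-Type": "application/json"}

def create_workspace(display_name, capacity_id=None):
    """Create a Fabric workspace, optionally assigning it to a capacity."""
    body = {"displayName": display_name}
    if capacity_id:
        body["capacityId"] = capacity_id  # optional field on the create-workspace call
    resp = requests.post(f"{FABRIC_API}/workspaces", headers=HEADERS, json=body)
    resp.raise_for_status()
    return resp.json()

# Illustrative domain/stage naming convention (an assumption, not a prescribed standard)
for domain in ["Finance", "Sales", "Supply Chain"]:
    for stage in ["Dev", "Test", "Prod"]:
        ws = create_workspace(f"{domain} - {stage}")
        print(ws.get("id"), ws.get("displayName"))
```

The same pattern can be wrapped in a template so every new domain gets a consistent set of workspaces, roles and capacity assignments.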
Coffee break (10:30 - 10:45)
Module 2: Medallion Architecture and Data Products (10:45 - 12:30)
- From data assets to data products: What makes a data product
- Medallion architecture: Bronze, silver, gold, and platinum layers
- Hands-on Lab: Building the bronze layer with raw data ingestion:
- Creating a lakehouse for raw data landing
- Implementing data pipelines for incremental extraction
- Handling schema drift and source system changes
- Establishing data lineage with Purview integration
- Hands-on Lab: Silver layer with business rules and quality gates: implementing Great Expectations for data quality validation (automated quality tests covering schema validation, business rule checks and referential integrity); building Data Activator alerts for quality threshold breaches; documenting data contracts (schema definitions, SLAs, quality rules); understanding data product ownership and accountability (see the quality-gate sketch below)
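Great Expectations provides the full framework used in the lab; as a library-agnostic illustration of the kind of silver-layer quality gates involved, here is a minimal PySpark sketch. The table, columns and thresholds are illustrative assumptions.

```python
from pyspark.sql import functions as F

# Assumption: "silver_gl_entries" exists with these illustrative columns.
df = spark.table("silver_gl_entries")

checks = {
    # Schema / business-rule style checks, analogous to Great Expectations expectations
    "no_null_account": df.filter(F.col("GLAccount").isNull()).count() == 0,
    "amount_is_typed": dict(df.dtypes).get("Amount") in ("decimal(18,2)", "double"),
    "debits_net_to_zero": abs(df.agg(F.sum("Amount")).collect()[0][0] or 0) < 0.01,
    "no_duplicate_lines": df.count()
        == df.dropDuplicates(["DocumentNumber", "LineItem"]).count(),
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # In the lab, this is where a Data Activator alert or pipeline failure is raised
    raise ValueError(f"Silver quality gate failed: {failed}")
print("All silver-layer quality checks passed")
```

The documented data contract then simply states which of these checks a data product guarantees, and at what thresholds.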
Lunch (12:30 - 13:30)
Afternoon Session (13:30 - 17:30)
Module 3: Gold Layer and Semantic Models for AI (13:30 - 15:00)
- Gold layer design: Consumption-optimized data products
- Hands-on Lab: Creating the gold layer with dimensional modeling (see the star-schema sketch below):
- Designing star schemas for analytics consumption
- Implementing incremental processing and performance optimization
- Applying row-level and column-level security
- Creating shortcuts for cross-domain data sharing without duplication
- Hands-on Lab: Building AI-ready semantic models: creating semantic models on gold layer data; enriching with metadata (descriptions, synonyms, display folders, data categories); configuring AI instructions/metadata for Copilot; building calculation groups for time intelligence; testing Copilot integration for natural language insights
- Hands-on Lab: MCP server integration: configuring MCP servers for programmatic AI access; building custom AI agents that query governed data products; validating AI interpretation of business metrics
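As a small, assumed example of the gold-layer modeling step, here is a PySpark sketch that derives a date dimension and a fact table from a silver table. All table and column names are illustrative.

```python
from pyspark.sql import functions as F

# Assumption: a silver table with transaction-level postings already exists.
silver = spark.table("silver_gl_entries")

# Date dimension spanning the data, at day grain
bounds = silver.agg(F.min("PostingDate").alias("lo"), F.max("PostingDate").alias("hi")).collect()[0]
dim_date = (
    spark.sql(
        f"SELECT explode(sequence(to_date('{bounds.lo}'), to_date('{bounds.hi}'), interval 1 day)) AS Date"
    )
    .withColumn("Year", F.year("Date"))
    .withColumn("Month", F.month("Date"))
    .withColumn("MonthName", F.date_format("Date", "MMMM"))
)
dim_date.write.format("delta").mode("overwrite").saveAsTable("gold_dim_date")

# Fact table at the grain the reports consume (day x company x account)
fact_gl = (
    silver.groupBy("PostingDate", "CompanyCode", "GLAccount")
    .agg(F.sum("Amount").alias("Amount"))
    .withColumnRenamed("PostingDate", "Date")
)
fact_gl.write.format("delta").mode("overwrite").saveAsTable("gold_fact_gl")
```

The semantic model is then built on these gold tables, and the metadata enrichment and MCP steps in the labs sit on top of that structure.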
Coffee break (15:00 - 15:15)
Module 4: DevOps, Governance, and Centre of Excellence (15:15 - 16:30)
- DevOps for Fabric: Version control, CI/CD, and deployment automation
- Hands-on Lab: Git-based deployment pipelines: configuring Git integration for Fabric workspaces; creating Azure DevOps pipelines for automated deployment; implementing deployment stages (development, test, production); building automated testing for data quality and semantic model validation; implementing rollback strategies for failed deployments
- Hands-on Lab: Governance automation: configuring Microsoft Purview for data cataloging and lineage; applying sensitivity labels and information protection policies; implementing data loss prevention (DLP) rules; creating compliance dashboards and audit reports
- Centre of Excellence patterns: Capability management, solution design, shared assets, community enablement
- Hands-on Lab: Creating reusable templates and patterns: building workspace templates (defined in markdown/JSON) with pre-configured elements required for security/governance; creating pipeline templates for common ingestion patterns; establishing documentation standards and wiki templates
Module 5: Measurement, Value Demonstration, and Continuous Improvement (16:30 - 17:30)
- Measuring success: Adoption metrics, data quality trends, business impact
- Live Demo: Building executive dashboards for data strategy ROI
- Tracking workspace usage and adoption by persona
- Monitoring data quality trends and cost of quality incidents
- Measuring time-to-delivery for new data products
- Calculating ROI: time saved, decisions accelerated, manual processes eliminated
- Operating model maturity assessment framework
- Common pitfalls and lessons learned from real-world implementations
- Scaling patterns: From pilot to enterprise rollout
Navigating the journey from Excel to Fabric (and back again!)
For decades, the entire business world has been built on the flexibility and power of spreadsheets. From rocket science to credit derivatives, a mind that could imagine it could implement it with nothing more than Excel formulas and cell references. But over time this has resulted in siloed, manual processes in "Shadow IT" where some of the most valuable data and logic in an organisation is locked away in these files.
But it doesn't need to be like that! The promise of Microsoft Fabric is to bring all your organisational data into a single place, in a single format where it can be extracted, shaped, enriched and secured once for use in any tool. So, from an organizational perspective, there are many benefits of such a unified data platform.
However, it’s not obvious how you would take an Excel-based process (or financial model) and bring it into a Power BI and Fabric environment. Perhaps it’s overkill to even consider such a move, and it’s potentially a significant business risk if those analysts used to working in Excel haven’t yet developed the different skillsets required for Fabric.
In this session we will take a financial credit risk model in Excel and go through the thought process and workflow for bringing this into Power BI/Fabric including:
- Considering how and why to use Power Query to combine and structure the data inputs into a star schema (which can be done in Excel!)
- Implementing the business logic with DAX, field parameters for increased flexibility and Translytical writeback to model scenarios directly from Power BI.
- Using visuals such as maps and dynamic scatter plots to tell a story with the data
- Breaking down the stages of data transformation/shaping into a medallion lakehouse structure in Fabric, with row/column-level security at each layer
- Bringing the data at various layers back into Excel with this security model applied
- Querying the data model with Copilot and Data Agents to get AI powered insights
Attendees will come away with a view of how to drive “on the ground” business process transformation using Microsoft Fabric and ideas on how analysts used to Excel can think differently about the structure and re-usability of data, in order to take advantage of an enterprise level BI/Data platform.
From Manual Month-End to AI-Powered Finance in Fabric
Finance teams work with the most sensitive data in any organization, yet they're often stuck in manual, spreadsheet-driven processes.
The challenge isn't just technology adoption—it's about navigating steep learning curves, maintaining governance and control, overcoming IT dependencies, and translating complex finance logic that lives in people's heads or Excel files into scalable, secure solutions.
Without a clear strategy, attempting to modernize these manual processes risks creating more technical debt rather than solving the underlying problems.
Microsoft Fabric and Copilot offer a path forward—low/no-code tools that finance teams can own and operate, with AI assistance to accelerate development while maintaining full control.
Recent innovations like Model Context Protocol (MCP) servers enable AI agents to help not just design but to actually build and maintain solutions through natural language, dramatically reducing implementation time while preserving governance and quality standards.
This hands-on training day walks through building a complete month-end reporting solution from scratch. You'll learn how to securely connect to ERP systems and spreadsheets, use AI-assisted notebooks and Copilot to consolidate and transform transactional data, implement medallion architecture (bronze/silver/gold layers) with appropriate security at each stage, enrich data models with the metadata and business context that makes AI effective, and integrate Copilot and Data Agents for natural language financial insights and commentary.
Full Day Agenda
Morning Session (09:00 - 12:30)
Module 1: Environment Setup and Data Connectivity (09:00 - 10:30)
- Understanding the finance reporting challenge: Why manual processes persist and what's required to replace them
- Fabric workspace provisioning and capacity allocation for financial workloads
- Hands-on Lab: Connecting to ERP systems with proper credential management: setting up secure connections to sample financial data sources; configuring authentication and managing connection strings; initial data profiling to understand source data structures
Coffee break (10:30 - 10:45)
- Hands-on Lab: Establishing data governance from day one: applying sensitivity labels to workspaces and datasets; configuring Purview policies for financial data; setting up audit logging for compliance requirements
Module 2: Building the Silver Layer with AI Assistance (10:45 - 12:30)
- Medallion architecture overview: Bronze, silver, and gold layers for financial data
- Introduction to Fabric notebooks and AI-assisted development
- Hands-on Lab: Creating the bronze lakehouse with raw data ingestion: using Data Factory pipelines for incremental extraction; handling schema drift and source system changes; implementing basic error handling and retry logic (see the incremental-load sketch at the end of this module)
- Hands-on Lab: MCP server setup and configuration: installing Claude Desktop and configuring the Fabric MCP server; authentication and workspace connection; testing basic operations through natural language commands
- Introduction to using AI agents for notebook development
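As a hint of what the incremental-extraction pattern looks like once data lands in the lakehouse, here is a minimal delta-merge sketch for a Fabric PySpark notebook. The table names, key columns and watermark logic are illustrative assumptions, not the pipeline the lab generates.

```python
from delta.tables import DeltaTable
from pyspark.sql import functions as F

# Assumptions: "bronze_gl_entries" already exists from the initial load, and the
# latest extract has landed in the Files area at this illustrative path.
new_batch = spark.read.option("header", "true").csv("Files/landing/gl_entries_latest.csv")

# Only keep rows newer than the current watermark in the bronze table
watermark = spark.table("bronze_gl_entries").agg(F.max("PostingDate")).collect()[0][0]
if watermark is not None:
    new_batch = new_batch.filter(F.col("PostingDate") > F.lit(watermark))

# Upsert into bronze on the document line's business key
target = DeltaTable.forName(spark, "bronze_gl_entries")
(
    target.alias("t")
    .merge(new_batch.alias("s"),
           "t.DocumentNumber = s.DocumentNumber AND t.LineItem = s.LineItem")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```

In the lab, the AI agent is used to generate and adapt this kind of logic rather than writing it by hand.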
Lunch (12:30 - 13:30)
Afternoon Session (13:30 - 17:30)
Module 3: Data Transformation and Silver/Gold Layers (13:30 - 15:00)
- Understanding financial data transformation requirements
- Hands-on Lab: Building the silver layer with business rules
- Using AI-assisted notebooks to consolidate multi-entity transactional data
- Implementing data quality checks with a framework (e.g. Great Expectations)
- Applying business logic transformations for financial calculations
- Hands-on Lab: Creating the gold layer with dimensional modeling: designing star schemas for financial reporting; building dimensions (time, account, cost center, entity); creating fact tables with appropriate grain; implementing row-level security for multi-entity access control
Coffee break (15:00 - 15:15)
Module 4: Semantic Models and AI Integration (15:15 - 16:30)
- Designing semantic models for both traditional BI and AI consumption
- Hands-on Lab: Building and enriching semantic models: creating DAX measures for key financial metrics (gross margin, variance analysis); using MCP servers to accelerate measure development through natural language; enriching models with metadata (descriptions, synonyms, display folders); configuring AI instructions and other metadata for financial terminology; building calculation groups for time intelligence (YTD, QTD, Prior Year)
- Hands-on Lab: Copilot and Data Agent integration: testing Copilot for natural language queries on financial data; building Data Agent skills for variance commentary and insights; creating conversational interfaces for board pack preparation; validating AI-generated insights against known calculations (see the validation sketch below)
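One way to validate AI-generated measures against known figures is sketched below using the semantic-link (sempy) library available in Fabric notebooks. The model name, measure, expected value and the assumption about how the returned frame names its measure column are all illustrative, not the lab's prescribed approach.

```python
# Assumes a Fabric notebook where the semantic-link (sempy) library is available.
import sempy.fabric as fabric

DATASET = "Finance Month-End Model"  # illustrative semantic model name

# Evaluate the AI-generated measure over the whole model
result = fabric.evaluate_measure(DATASET, measure="Gross Margin %")

# Independently calculated expectation, e.g. taken from the trusted Excel board
# pack (illustrative value).
expected = 0.42

# Assumption: the returned frame names the measure column after the measure itself.
model_value = float(result["Gross Margin %"].iloc[0])
if abs(model_value - expected) < 0.005:
    print(f"OK: model returns {model_value:.3f}, matching the known figure")
else:
    print(f"MISMATCH: model returns {model_value:.3f}, expected {expected:.3f}")
```

The same comparison can be repeated for each key metric and sliced by entity or period before trusting AI-generated commentary on the numbers.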
Module 5: Production Deployment and Best Practices (16:30 - 17:30)
- Security considerations: Row-level security, column masking, audit trails
- Performance optimization: Incremental refresh strategies, aggregations, query optimization
- Governance and compliance: Purview integration, lineage tracking, sensitivity labeling
- Deployment roadmap: Moving from development to production
- Monitoring and maintenance: Capacity metrics, pipeline health, data quality alerts
- Common pitfalls and troubleshooting strategies
- Resources and community support for continued learning
- Open Q&A
What attendees will take home:
- Working Fabric workspace with complete month-end reporting solution
- MCP server configuration templates and starter prompts for financial scenarios
- Semantic model templates with financial calculation patterns
- Power BI report templates for month-end board packs
- Deployment checklists and governance frameworks
- GitHub repository with all lab materials, sample data, and documentation
By the end of this training day, you'll have practical experience building AI-powered financial reporting in Fabric—with a clear implementation path that can be applied to your organization's specific requirements without requiring specialized data engineering skills.
Target audience: Data engineers, Power BI developers, database administrators, and BI professionals supporting finance departments or building analytical solutions for sensitive data environments.
From Excel to Fabric with AI Agents: Automating Migration with MCP Servers (Training Day)
For decades, spreadsheets have powered the business world. From financial models to risk analysis, Excel's flexibility has made it the go-to tool for countless critical processes. But this has created a problem: valuable data and business logic locked away in files, maintained through manual, siloed workflows that are difficult to scale, govern, or share.
Microsoft Fabric promises to change this—bringing your data into a unified platform where it can be structured, secured, and reused across any tool. But there's a challenge: migrating complex Excel models to Power BI and Fabric is time-consuming, technically demanding, and risky. How do you preserve intricate business logic? How do you maintain control whilst automating the heavy lifting?
Enter the Model Context Protocol (MCP)—the breakthrough that connects AI agents to Power BI and Fabric. These open-standard servers enable AI assistants like Claude to automate your migration: reading Excel structures, generating Power Query transformations, designing and building lakehouses, notebooks, semantic models, DAX measures and visualisations, and validating the entire workflow through natural language conversation. The AI does the grunt work while you maintain oversight at every critical decision point.
This hands-on training day walks through complete Excel-to-Fabric migrations using AI agents powered by MCP servers. You'll learn how to configure MCP servers, use context engineering to guide AI through your specific requirements, automate the migration workflow from Excel analysis through to deployed Fabric solutions, implement validation checkpoints and quality controls, and develop practical patterns for future migrations while preserving business logic.
Full Day Agenda
Morning Session (09:00 - 12:30)
Module 1: Understanding the Migration Challenge and MCP Setup (09:00 - 10:30)
- The Excel migration challenge: Why manual migrations are time-consuming, error-prone, and risk losing business logic
- Introduction to Model Context Protocol (MCP): How AI agents connect to Power BI and Fabric
- Architecture overview: MCP servers, Claude AI, and the Fabric ecosystem
- Hands-on Lab: Environment setup: installing Claude Desktop/VS Code and configuring Power BI and Fabric MCP servers; authenticating and connecting to a Fabric workspace; reading existing semantic models
- Hands-on Lab: Excel model analysis preparation: understanding the sample financial model structure; documenting business logic and calculation dependencies; identifying data sources, transformation requirements, and output expectations
Coffee break (10:30 - 10:45)
Module 2: Context Engineering and Excel Analysis (10:45 - 12:30)
- Context engineering fundamentals: How to guide AI agents effectively
- Creating context documents: Business requirements, data dictionaries, validation rules
- Hands-on Lab: Using AI agents to analyze Excel models: natural language commands to examine Excel structure and formulas; generating documentation of business logic and dependencies; identifying data transformation patterns and calculation flows; creating a migration plan with AI assistance
- Hands-on Lab: Data source preparation in Fabric: creating the Fabric workspace and lakehouse structure; using AI to generate initial bronze layer ingestion logic; understanding how AI interprets Excel data structures
- Introduction to validation strategies: How to verify AI-generated outputs
Lunch (12:30 - 13:30)
Afternoon Session (13:30 - 17:30)
Module 3: Automated Power Query and Lakehouse Generation (13:30 - 15:00)
- Translating Excel transformations to Power Query M code
- Hands-on Lab: AI-assisted Power Query generation: using natural language to describe transformation requirements; generating Power Query/notebook code from Excel formula patterns; creating silver layer transformations with business rule validation; implementing data quality checks based on Excel validation logic
- Hands-on Lab: Lakehouse and notebook creation: using AI to design medallion architecture (bronze/silver/gold); generating Python notebooks for complex transformations; creating delta tables with appropriate partitioning strategies; implementing incremental processing patterns; validation checkpoint: comparing outputs against source Excel calculations (see the validation sketch below)
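A minimal sketch of that validation checkpoint, assuming a Fabric notebook with pandas available; the Excel path, sheet name, table name and tolerance are illustrative assumptions.

```python
import pandas as pd

# Source-of-truth figures from the original Excel model (illustrative path/sheet)
excel_totals = (
    pd.read_excel("/lakehouse/default/Files/source/financial_model.xlsx", sheet_name="Summary")
    .groupby("Account", as_index=False)["Amount"].sum()
)

# Equivalent figures from the migrated gold layer (illustrative table name)
gold_totals = (
    spark.table("gold_fact_gl")
    .groupBy("GLAccount").sum("Amount")
    .withColumnRenamed("GLAccount", "Account")
    .withColumnRenamed("sum(Amount)", "Amount")
    .toPandas()
)

# Compare account by account, flagging anything outside a small rounding tolerance
comparison = excel_totals.merge(gold_totals, on="Account",
                                suffixes=("_excel", "_fabric"), how="outer")
comparison["diff"] = (comparison["Amount_excel"] - comparison["Amount_fabric"]).abs()
mismatches = comparison[comparison["diff"].fillna(float("inf")) > 0.01]

if mismatches.empty:
    print("Validation checkpoint passed: lakehouse matches the Excel model")
else:
    print("Differences found:")
    print(mismatches)
```

Running this kind of reconciliation after each AI-generated step is what keeps you in control while the agent does the heavy lifting.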
Coffee break (15:00 - 15:15)
Module 4: Semantic Model and DAX Measure Creation (15:15 - 16:30)
- Understanding semantic model design requirements
- Hands-on Lab: AI-assisted semantic model development: using MCP servers to create semantic models programmatically; generating a star schema from lakehouse data structures; translating Excel formulas to DAX measures through natural language; creating calculation groups for time intelligence patterns; building relationships and configuring model properties
- Hands-on Lab: Visualization and report generation: using AI to guide the creation of Power BI reports based on required outputs; implementing interactive features beyond Excel capabilities
Module 5: Production Deployment and Advanced Patterns (16:30 - 17:30)
- Version control and documentation: Maintaining AI-assisted code
- Security considerations: Ensuring migrated solutions meet governance requirements
- Performance optimization: Tuning AI-generated queries and transformations
- Advanced patterns for complex scenarios
- Resources and community support
- Open Q&A
Transform your finance function with Power BI, Fabric, ChatGPT, Copilot and AI Skills!
There is no doubt that the Microsoft data analytics and Gen AI platforms can help you re-imagine processes and re-invent an entire business function like Finance. But how? Whilst an interactive Power BI report (replacing the manual month end board pack for example) is an essential part of a solution, it is not likely to be enough by itself, especially in the age of stakeholders expecting Gen AI!
Instead, you need to go through a methodical approach: understanding your data, capturing stakeholder requirements (functional and non-functional!), using these to design a star schema semantic model, developing data engineering processes to shape and clean data, enriching your model with measures and metadata, and then designing prompt engineering to train Copilot or a custom RAG model on your data!
That's a lot of work! Fortunately, Copilot and ChatGPT can help with every single step once you're working in the AI Powered platform of Microsoft Fabric!
In this session Rishi will show an example of Income Statement reporting being taken through this process, all with minimal code and just enough effort to make this an example use case that can be completed in days and weeks rather than months and years.
Guided workshop on Financial Analytics with Microsoft Fabric
Adventureworks is so yesterday! Learning how to work in MS Fabric with the standard datasets you come across in training exercises is relatively simple but there’s still a huge gap in trying to apply it to a complex business domain such as Finance.
In this hands-on tutorial you will work in Microsoft Fabric through the full end-to-end analytics cycle working with a messy, unstructured set of financial (accounting) data, building a General Ledger/Trial Balance supported with a Chart of Accounts and Organisational Hierarchy, complete with an Income Statement/Profitability report in Power BI.
This full-day tutorial will be supported by an interactive e-learning course (dozens of task-based modules each complete with short videos, quizzes/scenario interactions and “practice the clicks” exercises), hosted in a professional Learning Management System, which you will have access to for 12 months after the session. You will also be part of a vibrant community of over 800 Finance/BI professionals, many of whom engage in the online Power Platform Finance (PPF) discussion/support forums.
Designing and Implementing a Scalable Data Strategy with Microsoft Fabric
Organising, managing, securing and optimising data within an organization requires more than technology - it requires a strategy that involves people, processes and technology all working in harmony: removing silos, implementing governance and security mechanisms, enabling self-service, and delivering timely, trusted insights to decision-makers.
As organizations continue to expand their data footprint across cloud applications, IoT systems, and operational databases, the need for a cohesive and scalable data strategy has never been more critical, and it has never been harder to know where to start.
This series of workshops presents a practical approach to designing and implementing a modern data strategy using Microsoft Fabric.
Anchored in real-world challenges derived from client case studies at Avanade (the largest Microsoft-based consultancy globally), we explore how to move from fragmented, manual processes to an integrated, governed, and business-aligned data platform.
We’ll cover key architectural and operating model considerations—drawing from data mesh principles—to ensure that data is accessible, secure, and tailored to the needs of business users.
Key takeaways include:
- How to align data strategy with organizational goals using domain-driven design
- Establishing data governance and security using Microsoft Purview
- Integrating diverse data sources into a unified architecture
- Empowering business teams through curated, reusable data products and Power BI self-service
- Practical considerations for reducing technical debt and accelerating time-to-insight
Attendees will gain insights into how Microsoft Fabric components such as OneLake, Data Factory, Purview, and Power BI can be orchestrated to deliver enterprise-ready data products, enable self-service analytics, and support real-time decision-making at scale.
Whether you are a technical practitioner, architect, or business leader, this session will offer actionable guidance to help you unlock the full potential of your data estate using Microsoft Fabric.
Achieving AI ambitions in the Finance function with Microsoft Fabric and Copilot
Finance teams have always been data-driven, but they face unique challenges in harnessing modern data and AI tools. Concerns around risk, compliance, and data sensitivity, combined with reliance on IT teams and ungoverned spreadsheet logic, have often slowed their ability to fully embrace digital transformation.
Microsoft Fabric, with its integrated services and Copilot capabilities, now offers finance professionals a secure, low/no-code platform to own and accelerate their data journey.
But to be truly impactful, this adoption must be strategic, anchored in strong data governance, scalable architecture, and AI readiness.
In this session, you’ll learn how to:
- Connect to finance data sources (ERP systems, Excel, databases) using secure, scalable, and low-code approaches.
- Clean, reshape, and consolidate data using Fabric notebooks and Copilot with minimal technical overhead.
- Apply and enforce pre-defined schemas and data models to bring consistency and clarity across reporting.
- Enable governed data sharing at all pipeline stages - raw, cleaned, and conformed, ensuring accuracy and trust.
- Utilise Fabric and Copilot to create AI-ready artifacts, enabling capabilities like automated commentary, data Q&A, and intelligent agents.
By the end, we’ll demonstrate how to produce a fully interactive, AI-assisted month-end board pack in Power BI, maintained directly by finance teams, delivering insights that stakeholders can explore autonomously through natural language and rich visual storytelling.
Whether you're a finance leader, data professional, or analytics enabler, you’ll walk away with practical, proven techniques to:
- Minimise technical debt.
- Reduce dependency on IT.
- Enhance data trust and usability.
- Build a foundation for scalable AI in finance.
This session is your blueprint for turning fragmented finance data into governed, self-service insights, unlocking the full value of Microsoft Fabric and Gen AI in your finance function and beyond.
Power BI, Fabric and role-based Copilots - how to get them to do what you expect!
There is no doubt that Generative AI, including tools like Microsoft Copilot, will transform the way we analyse and report on data. But by default, at least for now, it won't automatically be able to source and understand your corporate data in the way ChatGPT can with data on the internet (on which it has been trained over years, with trillions of data points!). The raw data in your organisation needs to be modelled and enriched with business-specific domain logic and context in order to evaluate performance and provide meaningful insights.
In this session Rishi will:
- Highlight the current capabilities of Power BI and role-based Copilots (e.g. Copilot for Finance)
- Discuss the value of metadata for data stored in lakehouses/semantic models and consider how it ties into prompts that guide Copilot on how to answer questions about your data
- Look at where Copilot experiences fit in across the entire analytics workflow, from understanding requirements/user stories through to data transformation, modelling/enrichment and process automation
You will come away from this session with an idea of why applying Copilot to analytics can be challenging and how you can provide it with enough context to deliver data-driven insights, keeping you firmly in the driver's seat!
Governance with Microsoft Fabric - A role-play based workshop: Part 2
Imagine yourself as a data analyst in the middle of trying to educate and convince senior stakeholders in a large organisation (e.g. Head of IT Security, Head of IT, CDO, COO) about how Microsoft Fabric can be securely implemented and governed. There are debates around how to balance the requirement for self-service with implementing sufficient technical controls to restrict access to potentially sensitive data.
All in an organisation where Power BI has brought a lot of value to the business but a lot of pain for IT, essentially forming part of a growing, ungoverned domain of “Shadow IT”. So these stakeholders aren’t immediately sold on the idea of bringing more analytics items into the mix with Microsoft Fabric.
In this session you will be immersed into a series of role-play scenarios with these stakeholders, delivered through an interactive e-learning course done as part of a live group workshop. The content is split into 6 core sections (modules) with a live peer brainstorming exercise and interactive Q&A at the end of each module.
In part 2 we cover the second set of three topics:
Module 4: Data Governance
Understand the principles and key activities required to set up:
- Integration between MS Fabric and Purview for Data lineage /impact analysis, Data Cataloguing, and automatic scanning/identification of sensitive data
- Information Protection, including Data Loss Prevention, Sensitivity Labels and data encryption.
- Data Safeguarding including Data residency, data recovery and end-to-end auditing as required to meet compliance requirements.
Module 5: Security
Define and implement security controls around access permissions, including:
- Data-level security such as row/object-level security and OneSecurity at the lakehouse/file level.
- Workspace and artefact security including workspace roles, sharing permissions and delegated data permissions.
- Identity and Network security including Gateways, private endpoints and Microsoft Entra Private Access.
Module 6: The Organisational Operating Model.
Bringing it all together and applying the principles of the Fabric Adoption Roadmap to your organisation for end-user and developer enablement.
This includes ensuring the right data culture and setting up an Analytics/BI centre of excellence so that all users and admins have the appropriate level of support and the right environment for learning and continuous improvement!
Governance with Microsoft Fabric - A role-play based workshop: Part 1
Imagine yourself as a data analyst in the middle of trying to educate and convince senior stakeholders in a large organisation (e.g. Head of IT Security, Head of IT, CDO, COO) about how Microsoft Fabric can be securely implemented and governed. There are debates around how to balance the requirement for self-service with implementing sufficient technical controls to restrict access to potentially sensitive data.
All in an organisation where Power BI has brought a lot of value to the business but a lot of pain for IT, essentially forming part of a growing, ungoverned domain of “Shadow IT”. So these stakeholders aren’t immediately sold on the idea of bringing more analytics items into the mix with Microsoft Fabric.
In this session you will be immersed into a series of role-play scenarios with these stakeholders, delivered through an interactive e-learning course done as part of a live group workshop. The content is split into 6 core sections (modules) with a live peer brainstorming exercise and interactive Q&A at the end of each module.
In part 1 we cover the first three topics:
Module 1: Overview.
Understand the high-level scope of what is required to effectively govern and manage a Fabric/Power BI estate.
Module 2: Microsoft Fabric as part of a Data Mesh Architecture.
Understand how the principles of Data Mesh apply to Microsoft Fabric and how it solves many of the challenges around data silos and dependencies on specialised technical teams.
Module 3: Administering and monitoring the Fabric environment:
- Manage Fabric capacities from a performance and consumption perspective
- Design feature controls using the Fabric Admin portal
- Know how to analyse Fabric adoption and usage!
Financial Modelling in the Banking Industry
Adventureworks is so yesterday! Learning how to build one of the reports you come across in training exercises is relatively simple, but there's still a huge gap in trying to apply it to a complex business domain such as Finance. Trying to create linked financial statements (e.g. P&L and Balance Sheet) in Power BI, where dynamic calculations need to flow across the strictly formatted visuals, is challenging to say the least! Add building a financial model into the mix, where you want to see the effect of tweaking certain (economic) variables on profitability/liquidity, and you require some serious DAX mastery!
In this session you will learn how to:
• Take industry Income Statement/Balance Sheet data for 4 Banks (Barclays, Citi, HSBC, Lloyds) and turn it into a financial reporting solution in Power BI
• Create an interactive benchmarking report comparing the financial performance of the Banks on key metrics such as Income Growth, Cost:Income Ratio and Current Ratio
• Create a simple Income Statement and Balance Sheet for each bank
• Link the Income Statement and Balance Sheet together through Retained Earnings (illustrated in the sketch after this list)
• Introduce what-if scenario modelling for key variables such as headcount growth and annual change in loans/deposits
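As a simple illustration of that linkage, the retained earnings roll-forward that ties the two statements together looks like this (the figures here are made up for the example; the session uses real published data for the four banks):

```python
# Illustrative figures only - not actual bank data.
opening_retained_earnings = 5_000   # from last period's Balance Sheet
net_income = 1_200                  # bottom line of this period's Income Statement
dividends_paid = 400                # distributions to shareholders

# Closing retained earnings flow into this period's Balance Sheet equity,
# which is how the Income Statement and Balance Sheet stay linked.
closing_retained_earnings = opening_retained_earnings + net_income - dividends_paid
print(closing_retained_earnings)  # 5800
```

In the session this roll-forward is expressed as DAX measures flowing across the formatted statement visuals, which is where the modelling challenge lies.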
Attending this session will allow you to understand how Power BI can be applied to a complex real-life scenario and show how to use Tabular Editor 3 to script and organise measures so that you can adjust and apply them to any similar models with the click of a button!
Achieving AI ambitions in the Finance function with Microsoft Fabric and Copilot
In any organisation, the Finance team is naturally driven by data and analytics but they have often been slow to adopt new technologies because of risk/governance (working with the most sensitive data in the firm!), challenges with climbing the steep learning curve, over reliance on IT teams, and the complexities of applying finance logic which most commonly exists in people’s heads or in spreadsheets!
The self-service nature of Microsoft Fabric (which includes Power BI) and Copilot Studio makes it accessible to business users, but it needs to be deployed as part of an overall data/AI strategy in order to address all the concerns and avoid building additional technical debt.
In this session Rishi and Leon will show:
- Low/no-code approaches to connecting to finance data (e.g. ERP systems and spreadsheets) in a secure and performant manner
- Consolidating, cleaning, and re-shaping transactional data using Fabric Copilot and notebooks
- Using pre-defined data schemas to ensure data is represented in a consistent format for analytics
- Securely sharing data and reports at different stages of the data pipeline (raw, cleaned, conformed) to ensure a single version of the truth
- Building AI solutions on top of the data models (e.g. AI Skills linked into Copilot/agents)
The end result will be a month-end board pack in Power BI which stakeholders can self-serve on through visual interactions and Power BI Copilot, supported by a set of connected artifacts in Fabric that can be maintained by the finance team.
Attendees will learn techniques to ensure that their data is clean, accessible (by both Gen AI and humans!), enriched with metadata such as descriptions/relationships, and with appropriate business context embedded into its logic. These are all essential elements to ensure that companies can take advantage of the Gen AI revolution in the finance sector and beyond!
Building a credit rating model in Power BI
This session will explore how to transform a traditional Excel-based credit rating model for countries into a modern, interactive, and scalable solution using Power BI and Microsoft Fabric. Participants will discover innovative techniques to enhance financial modeling and storytelling by leveraging advanced analytics and integration capabilities.
Hear from Power Platform in Finance specialist and Microsoft MVP, Rishi Sapra, as we delve into the following areas:
• Replicating Complex Rating Logic: Translating intricate credit rating formulas and methodologies from Excel into DAX to ensure consistency and accuracy.
• Interactive Insights: Using slicers, bookmarks, and contextual visualizations in Power BI to enable dynamic comparisons and highlight key drivers of credit ratings.
• Streamlined Reporting: Designing intuitive and user-friendly reports that empower stakeholders to explore insights interactively.
• Scalability with Microsoft Fabric: Harnessing the Medallion Lakehouse architecture to enable seamless data integration, metadata management, and collaboration at an enterprise level.
This session is particularly beneficial for finance professionals and analysts who are seeking to modernize their financial or operational models. Attendees will leave with practical strategies to maintain model accuracy, enhance interactivity, and scale their solutions to meet the evolving demands of modern analytics.
Tips and Tricks for working with Finance data in Power BI
Excel has always been the tool of choice for the finance team, with the flexibility it provides for logic, formatting and presentation of numbers. But this flexibility has also caused governance nightmares, performance issues and huge risks from manual processes. Is it possible to achieve the desired outcomes and flexibility with Power BI whilst also having all the benefits of working in a more controlled, automated and feature-rich environment? Yes! In this session Rishi will show how you can have your finance cake and eat it, building dynamically formatted financial statements, waterfall charts and KPIs in Power BI to tell an engaging story with finance data. This will be based on the data and reports shown in the blog series at https://aka.ms/pbiincomestatement and the role-play session at https://tinyurl.com/PPFroleplay
Aggregations and What-if Scenario Modelling in Power BI
Power BI isn't just a read-only historical view of your data - you can also enter parameters which feed into your model and drive outputs under different scenarios. In this session Rishi will show this process using a simple Power Apps form embedded into Power BI, supported by a basic workflow. This solution is made even more seamless by combining a DirectQuery connection to the parameter table with an imported data model on the full dataset, utilising the Composite Model feature of Power BI. Also learn how to use this feature to build aggregate tables - fast in-memory tables for viewing data grouped by one or more dimensions, with an automatic switch to DirectQuery when drilling down into transaction-level views. The sheer breadth of use cases to which this can apply, and the performance and power it can bring to your reports, is guaranteed to take your breath away!
European Microsoft Fabric Community Conference 2025 Sessionize Event
Power BI & Fabric Summit 2025 Sessionize Event
Month of Copilots Sessionize Event
SQLBits 2024 - General Sessions Sessionize Event
#DataWeekender v4.2 Sessionize Event
South Coast Summit 2021 Sessionize Event