Åsne Holtklimpen
Senior Architect - Microsoft AI and Compliance Lead | Microsoft Copilot MVP | Microsoft MCT
Kristiansand, Norway
Åsne Holtklimpen bridges the gap between technical operations and business strategy.
As a Senior Architect at Crayon (soon to be Software One), she leads efforts to help enterprises mitigate risk while deploying advanced cloud solutions.
Recognized as one of the 50 Top Women in Technology in 2022, Åsne combines deep technical expertise with a strong community presence.
Her expertise centers on the architectural realities of the Microsoft ecosystem, specifically Microsoft Entra, Purview, and M365 collaboration tools. Åsne moves beyond the hype of AI to address the critical governance required for Microsoft Copilot, focusing on data hygiene, lifecycle management, and access control.
A recognized voice in the global community and a Microsoft MVP, Åsne ensures that organizations can leverage intelligent tools safely and effectively, transforming complex security requirements into actionable business value.
Solving the Security-Availability Paradox
In modern cybersecurity, rigid access controls often conflict with business continuity during emergencies. When key personnel are off-grid or abroad during a critical incident, organizations frequently face a binary choice: compromise security standards for speed, or risk operational failure.
This session presents a third option: Context-aware resilience.
Åsne Holtklimpen and Per-Torben Sørensen will unpack the architectural integration of Microsoft Entra and Microsoft Purview. They will showcase how to design a "Break Glass" mechanism that relies on data sensitivity and user context rather than static network perimeters.
Data Readiness and Security Prerequisites for Microsoft Copilot
Successful implementation of Microsoft Copilot requires more than just license assignment; it demands a robust foundation of information security and data governance. Without proper preparation, organizations risk exposing sensitive data through unauthorized AI retrieval.
This session outlines the essential architectural and compliance steps required before and during Copilot deployment. Key topics include:
- Data Classification: Utilizing Microsoft Purview to label and protect sensitive assets.
- Access Management: Remedying "technical debt" in permission structures to prevent over-sharing.
- Compliance Strategy: Aligning AI use with GDPR and internal security policies.
- User Adoption: Training methodologies for secure AI interaction.
Attendees will gain actionable insights into structuring their data environments to ensure Copilot delivers value without compromising information security.
How on earth do we stay in control in a Copilot world?
We are in the middle of a technological quantum leap – AI agents, automation, and Copilots working around the clock for us. It’s powerful. It’s efficient. But let’s be honest: how much control do we really have when AI reads, interprets, and acts on our behalf at record speed?
Welcome to a new everyday life with intelligent assistants
Microsoft Copilot is no longer just a tool – it’s a digital colleague. It pulls insights from Teams, Word, Outlook, SharePoint, and Planner. It suggests, writes, prioritises, and even thinks – almost. And the big question is:
How do we make sure it does this correctly, safely, and with the right access?
Urgent access without security compromise!
Imagine this: Your Lead Architect is on holiday abroad. A major incident hits your home base, requiring their immediate intervention on a highly sensitive project. The clock is ticking. Do you compromise your security standards to let them in, or keep the door locked and risk the project?
You don't have to choose.
Join Åsne Holtklimpen and Per-Torben Sørensen as they reveal how to solve the "Security vs. Availability" paradox. Learn how to leverage the integration of Microsoft Entra and Microsoft Purview to create a dynamic access solution that adapts to crises without lowering the drawbridge.
Key takeaways include:
- Dynamic Geo-Filtering: Configuring Conditional Access to adapt to user location without exposing the environment (a minimal sketch follows this list).
- Label-Based Logic: Utilizing Sensitivity Labels as the trigger for granular access controls.
- Global Governance: Implementing Microsoft Information Protection (MIP) to secure data assets everywhere, from the corporate office to a hotel Wi-Fi.
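Purely as an illustration of the geo-filtering takeaway, and not necessarily the configuration the speakers demonstrate, here is a minimal Python sketch against the Microsoft Graph v1.0 Conditional Access endpoints. It creates a country-based named location and a report-only policy that requires MFA for a hypothetical responders group when they sign in from outside the approved countries; the group ID, country list, and token handling are placeholders.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
# Placeholder: acquire a token with Policy.ReadWrite.ConditionalAccess (e.g. via MSAL).
TOKEN = "<access-token>"
HEADERS = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

# 1. Create a named location for the countries travelling staff may connect from.
location = {
    "@odata.type": "#microsoft.graph.countryNamedLocation",
    "displayName": "Approved travel countries",
    "countriesAndRegions": ["NO", "SE", "DK"],      # illustrative list only
    "includeUnknownCountriesAndRegions": False,
}
loc = requests.post(f"{GRAPH}/identity/conditionalAccess/namedLocations",
                    headers=HEADERS, json=location)
loc.raise_for_status()
location_id = loc.json()["id"]

# 2. Create a report-only Conditional Access policy that requires MFA for a
#    hypothetical "emergency responders" group outside those countries.
policy = {
    "displayName": "Emergency access - require MFA outside approved countries",
    "state": "enabledForReportingButNotEnforced",   # start in report-only mode
    "conditions": {
        "users": {"includeGroups": ["<emergency-responders-group-id>"]},
        "applications": {"includeApplications": ["All"]},
        "locations": {"includeLocations": ["All"],
                      "excludeLocations": [location_id]},
    },
    "grantControls": {"operator": "OR", "builtInControls": ["mfa"]},
}
resp = requests.post(f"{GRAPH}/identity/conditionalAccess/policies",
                     headers=HEADERS, json=policy)
resp.raise_for_status()
print("Created policy", resp.json()["id"])
```

Starting the policy in report-only mode lets you observe its impact before enforcement, in the spirit of adapting to crises without lowering the drawbridge.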
Hit PLAY on Copilot - What could possibly go wrong?
What happens when you hit PLAY and give yourself the turbo boost you need to accelerate user adoption? And what happens when you hit PAUSE and say, “Wait a moment!” before unleashing AI? Here we’ll give you the most important safety measures you need to have in place – otherwise, things can go wrong.
This is not just a presentation – it’s a wake-up call with real stories, practical tips, and experiences from the real world.
So… will you press Play or Pause?
Come and find out what happens when you give Copilot control – with both your head and your heart in the right place.
Wild Ideas, Safe Agents
Innovation requires freedom, but enterprise AI requires control. How do you get both?
This session explores the delicate art of allowing your Copilot Studio agents to run free with creativity while keeping them tethered to reality.
We will walk through the lifecycle of an agent, showing you how to capture exciting ideas and discipline them into reliable business processes. Come curious about the possibilities, and leave with the knowledge to build safe, effective, and compliant bots.
Securing the Source: Strategic Data Governance for Copilot and Beyond
As organizations rush toward AI adoption, the underlying health of their data estate has become a critical business risk. Before deploying Copilot for Microsoft 365, you must address the "elephant in the room": massive, unmanaged accumulations of sensitive data at rest.
This session bridges the gap between high-level governance and technical execution. We will explore how to use Microsoft Purview and Defender for Cloud Apps not just as tools, but as a strategic framework for risk reduction. You will walk away with a roadmap for identifying data hotspots, understanding the cost-impact of governance, and ensuring your organization’s AI journey is built on a secure foundation.
Key Takeaways:
- The business case for data discovery in the age of AI.
- Building a repeatable governance lifecycle for SharePoint and Dataverse.
- Cost considerations and ROI of Data Security Posture Management (DSPM).
Mission Critical: Hardening Your Data Estate for Copilot for M365
Is your organization actually ready for Copilot? AI doesn’t create security holes; it finds the ones you already have. The most common pitfall for AI adopters is "over-sharing" and the accumulation of sensitive data in forgotten corners of SharePoint and OneDrive.
This session provides a tactical blueprint for the "Pre-Copilot Clean-up." We focus on using Microsoft Purview and Defender for Cloud Apps to identify sensitive data hotspots before they are indexed by LLMs. Learn how to automate sensitivity labeling and use discovery findings to harden your security posture, ensuring that Copilot only surfaces what it should.
Key Takeaways:
- The "Copilot Readiness" checklist for data security.
- Rapid discovery methods to find "at-risk" sensitive data.
- Automating labels and permissions to prevent AI data leakage.
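To make the automation point a little more tangible, here is a minimal, hedged sketch of applying a sensitivity label to a single file through Microsoft Graph. The drive and item IDs, label GUID, and token handling are placeholders, the assignSensitivityLabel action is a metered API whose availability should be verified for your tenant, and production-scale labeling would normally be driven by Purview auto-labeling policies rather than scripts.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"               # placeholder; e.g. app-only Files.ReadWrite.All
HEADERS = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

DRIVE_ID = "<drive-id>"                # hypothetical SharePoint document library
ITEM_ID = "<item-id>"                  # hypothetical file flagged during discovery
LABEL_ID = "<sensitivity-label-guid>"  # e.g. a "Confidential" label from Purview

body = {
    "sensitivityLabelId": LABEL_ID,
    "assignmentMethod": "auto",        # mark the label as automatically applied
    "justificationText": "Applied during pre-Copilot clean-up",
}

# assignSensitivityLabel is a long-running, metered Graph action; a successful
# request is acknowledged with 202 Accepted and completes asynchronously.
resp = requests.post(
    f"{GRAPH}/drives/{DRIVE_ID}/items/{ITEM_ID}/assignSensitivityLabel",
    headers=HEADERS, json=body,
)
resp.raise_for_status()
print("Labeling request accepted:", resp.status_code)
```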
Building the Foundation: Structural Data Security for the AI Era
The success of an AI deployment is determined long before the first prompt is typed. It is built on the integrity of your data estate. Many organizations find that "over-sharing" in SharePoint and OneDrive is the single greatest barrier to a successful Copilot for Microsoft 365 implementation.
In this session, we provide a structured, tactical blueprint for hardening your data infrastructure. We will demonstrate how to leverage the synergy between Microsoft Purview and Defender for Cloud Apps to create a "clean-room" environment for AI. You will learn a repeatable methodology for discovering sensitive data accumulations, validating permissions, and implementing automated labeling to ensure that Copilot respects the boundaries of your most critical information.
Immediate Tactics to Prevent Copilot Data Leaks
This session is your emergency kit for the "Pre-Copilot Clean-up." We skip the fluff and dive straight into how to use Microsoft Purview and Defender for Cloud Apps to hunt down data hotspots before the LLM indexes them.
AI has a way of finding exactly what you didn't want it to see. The reality is that Copilot for Microsoft 365 doesn't break your security model; it simply shines a high-powered flashlight on the "over-sharing" and data sprawl that’s been hiding in your environment for years. If your SharePoint and OneDrive are cluttered with sensitive "dark data," your AI rollout is a liability waiting to happen.
Join us to learn how to identify, quarantine, and label sensitive info at lightning speed, ensuring your AI experience is powerful, not perilous.
Key Takeaways:
- The "Triage" List: Which data types pose the biggest risk to your AI rollout?
- Search & Destroy: Using Content Search to find and remediate "over-shared" folders.
- Automated Guardrails: Setting up auto-labeling policies that work while you sleep.
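The session itself leans on Content Search and Purview; purely as a complementary illustration of the "over-shared folders" problem, the sketch below walks a document library via Microsoft Graph and flags items that carry anonymous or organization-wide sharing links. The drive ID and token acquisition are placeholders, and pagination is omitted for brevity.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"     # placeholder; needs Files.Read.All or Sites.Read.All
HEADERS = {"Authorization": f"Bearer {TOKEN}"}
DRIVE_ID = "<drive-id>"      # hypothetical SharePoint document library

def flag_overshared(item_id: str, path: str) -> None:
    """Print items that are shared via anonymous or organization-wide links."""
    perms = requests.get(f"{GRAPH}/drives/{DRIVE_ID}/items/{item_id}/permissions",
                         headers=HEADERS)
    perms.raise_for_status()
    for perm in perms.json().get("value", []):
        scope = perm.get("link", {}).get("scope")
        if scope in ("anonymous", "organization"):
            print(f"OVER-SHARED ({scope}): {path}")

def walk(item_id: str | None = None, path: str = "/") -> None:
    """Recursively visit folders and check sharing on every item.

    Pagination (@odata.nextLink) is omitted to keep the sketch short.
    """
    base = f"{GRAPH}/drives/{DRIVE_ID}"
    url = f"{base}/root/children" if item_id is None else f"{base}/items/{item_id}/children"
    children = requests.get(url, headers=HEADERS)
    children.raise_for_status()
    for child in children.json().get("value", []):
        child_path = path + child["name"]
        flag_overshared(child["id"], child_path)
        if "folder" in child:
            walk(child["id"], child_path + "/")

walk()
```

Content Search and DSPM remain the scalable way to do this across a tenant; the sketch is only meant to make the over-sharing signal concrete.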
Governance and Operational Control in the Era of AI Agents
The integration of automated AI agents and Microsoft Copilot represents a significant shift in operational efficiency. However, as AI tools increasingly read, interpret, and act upon data from Teams, Outlook, and SharePoint, organizations face new challenges regarding data control and decision-making authority.
This session addresses the governance implications of deploying intelligent assistants as "digital colleagues." We will examine the critical balance between efficiency and oversight, focusing on:
- Data Access Boundaries: Managing what Copilot can access and interpret.
- Automated Decision Making: Establishing guardrails for AI-driven actions.
- Information Lifecycle: Ensuring data hygiene in connected M365 workloads.
Join this session to learn frameworks for maintaining organizational control while maximizing the productivity benefits of Microsoft Copilot.
How to Architect Zero Trust AI
You’ve deployed M365 Copilot. The licenses are assigned, and the excitement is high. But so is the risk. Now that the "toy phase" is over, how do you operationalize AI in a verified Zero Trust architecture?
This session leaves the basics behind. We won’t discuss what a label is; we will discuss how to automate label application using trainable classifiers to protect data Copilot can access. We will dive into the Semantic Index, how to manipulate what it surfaces, and explore "Just-in-Time" access for AI agents.
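"Just-in-Time" access can take several forms; as one loosely related illustration (not necessarily the mechanism demonstrated in this session), the sketch below requests a time-boxed Entra role activation through the Graph PIM API. The principal ID, role definition ID, and token handling are placeholders.

```python
import requests
from datetime import datetime, timezone

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"   # placeholder; needs RoleAssignmentSchedule.ReadWrite.Directory
HEADERS = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

request_body = {
    "action": "selfActivate",
    "principalId": "<agent-or-user-object-id>",   # hypothetical agent identity
    "roleDefinitionId": "<role-definition-id>",   # eligible role to activate
    "directoryScopeId": "/",
    "justification": "Time-boxed activation for an approved agent task",
    "scheduleInfo": {
        "startDateTime": datetime.now(timezone.utc).isoformat(),
        "expiration": {"type": "afterDuration", "duration": "PT2H"},  # expire after 2 hours
    },
}

# Submit a PIM activation request; the role's PIM policy (approval, MFA,
# maximum duration) still decides whether the activation succeeds.
resp = requests.post(f"{GRAPH}/roleManagement/directory/roleAssignmentScheduleRequests",
                     headers=HEADERS, json=request_body)
resp.raise_for_status()
print("Activation request status:", resp.json().get("status"))
```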
Target Audience: Enterprise Architects & Security Leads who are done with the basics and ready to harden their AI environment.
Detecting and governing "Bring Your Own Model"
Your users found a way around your Copilot restrictions on day one. They are using "Shadow AI": unapproved LLMs and PDF analyzers to do their jobs. Blocking them inhibits innovation; ignoring them invites a breach.
This session explores the architectural patterns for "Safe AI Adoption." We will look at using Defender for Cloud Apps to uncover Shadow AI usage, configuring Endpoint DLP to sanitize data before it hits an external model, and how to govern "Bring Your Own Model" scenarios on Azure.
It's time to stop fighting the tide and start building the dam.
Unlocking Digital Safety: Mastering Information and Identity Security in M365
Ever wondered how to keep Copilot and other digital tools safe and effective? This session is ideal for anyone who wants to understand the role Microsoft Purview and Copilot Security play in creating a secure information and identity environment within the M365 universe. We explore why confidentiality, integrity, availability, identity security, and legal compliance are crucial, and how M365 tools combined with Microsoft Purview and Copilot Security provide the security foundation for your data, ensuring only authorized access.
Despite ongoing discussions about securing data with Copilot, many organizations still struggle to prioritize it and to understand why it matters. You will discover how these tools safeguard your environment against unauthorized access, data breaches, and compliance risks, and how they help you manage and govern your data across M365 and other platforms.
Copilot and Information Security
Have we learned to secure data?
It seems that while some organizations have strengthened their data-handling processes, others have discovered "old debt" in the form of existing security gaps or shortcomings in access management. This has been an important learning curve for many, and it underscores the need for continuous assessment and improvement of information security practices.
So how do we make sure we are protected?
NIC Rebel Edition
MVP-Dagen 2025
NIC Empower 2024
MVP-Dagen(e) 2024