

Danny Kruge
Mission Critical Engineer at Schuberg Philis
Laren, The Netherlands
Danny Kruge is a seasoned data professional with a robust background in SQL Server and over a decade of experience in the field. Originally from Leicester, England, and now residing in the Netherlands, he has dedicated years to honing his skills and expanding his expertise into areas like Azure and Windows Engineering. Danny is particularly passionate about Azure Data Factory, where he leverages his deep knowledge to design efficient data workflows. As an active community leader, he organizes the Azure Saturday Netherlands event and the Azure Heroes user group, helping to foster knowledge-sharing and collaboration among tech enthusiasts. Committed to continuous learning and community support, Danny enjoys sharing his insights to help others navigate data challenges and avoid common mistakes.
Seamless Cross-Tenant Azure SQL Migrations with Geo-Replication
Discover how to perform seamless, no-downtime migrations of Azure SQL databases across tenants using geo-replication and the Microsoft backbone. This session will delve into the technical steps to set up secure cross-tenant geo-replication without exposing public endpoints. Learn how to maintain read-only replicas in the current tenant to keep applications running during the migration process, ensuring business continuity. Whether you’re planning a tenant-to-tenant migration or exploring strategies for hybrid operational setups, this session will equip you with practical insights and best practices to achieve secure, efficient, and downtime-free transitions.
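As a sketch of the sequence the session walks through, the three T-SQL steps below set up a geo-secondary, fail over to it, and remove the old link. Server and database names are placeholders, and the surrounding details (cross-tenant authentication, firewall rules, connection-string cutover) are exactly what the session covers and are omitted here.

```python
# Hypothetical server/database names; the real migration targets a server in
# another Azure AD tenant, reachable over the Microsoft backbone.
def geo_replication_steps(primary_server: str, secondary_server: str, database: str) -> list[str]:
    """Return the ordered T-SQL statements for a cross-tenant migration sketch."""
    return [
        # 1. On the primary, seed a readable secondary on the target server.
        f"ALTER DATABASE [{database}] ADD SECONDARY ON SERVER [{secondary_server}];",
        # 2. At cutover, promote the secondary to primary (planned failover,
        #    run against the secondary server).
        f"ALTER DATABASE [{database}] FAILOVER;",
        # 3. After cutover, drop the old geo-replication link.
        f"ALTER DATABASE [{database}] REMOVE SECONDARY ON SERVER [{secondary_server}];",
    ]

for stmt in geo_replication_steps("src-sql", "dst-sql", "AppDb"):
    print(stmt)
```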
Harnessing Microsoft Fabric for Real-Time Crypto Data Analysis with Azure Data Factory and Power BI
In today’s rapidly changing financial landscape, businesses need timely insights to make effective decisions. This session will dive into a real-world application of Microsoft Fabric, showcasing how to ingest and analyze cryptocurrency data in near real-time. We’ll walk through a complete, step-by-step solution—from ingesting crypto data through Azure Data Factory, storing it in a Lakehouse, and making it available in Power BI every two minutes through a semantic model. Attendees will learn how to set up this architecture, maximize the capabilities of Microsoft Fabric, and deliver fast, actionable insights for data-driven decisions.
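To make the ingestion step concrete, here is a minimal sketch of the flattening that happens between the API call and the Lakehouse landing table: one ticker snapshot in, one tabular row per symbol out. The payload shape is invented for illustration; real exchange APIs differ.

```python
import json
from datetime import datetime, timezone

# Hypothetical payload shape; real crypto APIs return richer documents.
SAMPLE = '{"BTC": {"price": 97000.5, "volume": 1200}, "ETH": {"price": 2700.1, "volume": 8800}}'

def flatten(payload: str, ts: datetime) -> list[dict]:
    """Turn a ticker snapshot into rows ready for a Lakehouse landing table."""
    data = json.loads(payload)
    return [
        {"symbol": sym, "price": v["price"], "volume": v["volume"],
         "ingested_at": ts.isoformat()}
        for sym, v in data.items()
    ]

rows = flatten(SAMPLE, datetime(2024, 1, 1, tzinfo=timezone.utc))
print(rows[0])
```

In the session this step runs inside Azure Data Factory on a two-minute trigger; the sketch only shows the row shape the semantic model ultimately reads.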
Unleashing the Power of OLAP: Combining PowerShell and DuckDB to Transform Data Analysis
Unlock new possibilities in data analysis by combining the strengths of PowerShell with the versatility of DuckDB. As someone with a SQL background and no Python experience, I discovered how to wield OLAP techniques within PowerShell, enabling powerful querying and data manipulation capabilities against Parquet files both locally and in Azure.
In this talk, I will share my journey of integrating DuckDB with PowerShell, demonstrating how I was able to set up a basic OLAP infrastructure on my local laptop—something I had never achieved before. Investigating Parquet files for data errors has never been easier, thanks to DuckDB's innovative approach.
Join me as I provide a practical demo and explore the possibilities that arise from merging these two languages. Whether you're a SQL professional looking to expand your toolkit or someone intrigued by the potential of local and cloud-based OLAP solutions, this session will reveal how to leverage PowerShell and DuckDB to elevate your data analysis capabilities.
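The core pattern the session drives from PowerShell is handing DuckDB a query over a Parquet file. As a hedged illustration, the helper below only builds the CLI invocation (file name, predicate, and the `-csv`/`-c` flags are assumptions about your DuckDB CLI version) rather than assuming DuckDB is installed:

```python
# Sketch only: build the DuckDB CLI command for querying a Parquet file.
# read_parquet() is DuckDB's table function for Parquet input.
def duckdb_parquet_query(parquet_path: str, predicate: str) -> list[str]:
    sql = f"SELECT COUNT(*) FROM read_parquet('{parquet_path}') WHERE {predicate};"
    return ["duckdb", "-csv", "-c", sql]

# e.g. counting suspect rows while hunting data errors in a Parquet file
cmd = duckdb_parquet_query("sales.parquet", "amount IS NULL")
print(" ".join(cmd))
```

From PowerShell the same command list would be passed to the `duckdb` executable directly, which is the integration the demo shows.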
Automating SQL Permissions Scripting for Daily Audits and One-Click Deployment
Discover the game-changing strategy to secure your SQL environment at our session, where we unveil an innovative approach to daily scripting of SQL permissions against the renowned StackOverflow2010 database. Join us and learn how this practice can transform your database auditing and security protocols.
Key Takeaways:
Audit and Compliance Mastery:
Learn how to seamlessly script all permissions to meet stringent auditing standards.
Gain the knowledge to simplify compliance reporting and certification processes.
Enhanced Traceability:
Understand the importance of maintaining an impeccable history of permission changes.
Acquire skills to trace permissions back to specific roles and users, enhancing accountability.
Boosted Disaster Recovery Readiness:
Dive into techniques for rapid restoration of permissions in case of accidental changes or losses.
Ensure your organization can quickly recover and maintain operational continuity.
Top-Notch Security Management:
Get insights into identifying and managing unnecessary permissions to fortify your security posture.
Master regular reviews and adjustments to align with the principle of least privilege.
I created these scripts, which I offer on GitHub for all to use, and relied on them across multiple migration projects to make sure permissions were never an issue at the SQL Server level.
Did I forget to mention you can deploy this in one click!
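The heart of the approach is turning an audit snapshot into re-runnable T-SQL. As a minimal sketch (in the real scripts the rows come from a query over `sys.database_permissions` joined to `sys.database_principals`; here they are hard-coded samples):

```python
# Sample audit rows; in production these come from the system catalog views.
AUDIT_ROWS = [
    {"principal": "app_reader", "permission": "SELECT", "securable": "dbo.Orders"},
    {"principal": "app_writer", "permission": "INSERT", "securable": "dbo.Orders"},
]

def script_grants(rows: list[dict]) -> list[str]:
    """Emit GRANT statements that recreate the captured permission state."""
    return [
        f"GRANT {r['permission']} ON {r['securable']} TO [{r['principal']}];"
        for r in rows
    ]

for line in script_grants(AUDIT_ROWS):
    print(line)
```

Because the output is plain T-SQL, the daily snapshot doubles as the one-click redeployment artifact: restoring permissions is just executing the file.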
Efficiently Migrating 8TB of Data in One Day: Leveraging PowerShell and Commvault
Join us for an in-depth discussion on how we successfully completed the complex task of migrating 8TB of data across data centers in just one day. This session will delve into the techniques and strategies employed, focusing on the powerful combination of PowerShell scripting and Commvault for data transfer. Additionally, we'll explore the implementation of on-prem SQL Server 2017 and how multiple PowerShell and T-SQL techniques facilitated a smooth and efficient migration process. Whether you are preparing for your own data migration or looking to optimize your current methodologies, this talk will provide valuable insights and practical tips to ensure success.
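One planning technique from migrations like this is balancing databases across parallel transfer waves so no single wave dominates the day. A greedy sketch (database names and sizes are illustrative; the actual transfer used Commvault driven by PowerShell, not this script):

```python
# Greedy size balancing: biggest databases first, each into the
# least-loaded wave so far.
def balance_waves(dbs: dict[str, int], waves: int) -> list[list[str]]:
    buckets = [[] for _ in range(waves)]
    totals = [0] * waves
    for name, size_gb in sorted(dbs.items(), key=lambda kv: -kv[1]):
        i = totals.index(min(totals))   # least-loaded wave so far
        buckets[i].append(name)
        totals[i] += size_gb
    return buckets

plan = balance_waves({"dwh": 4000, "erp": 2000, "crm": 1500, "logs": 500}, 2)
print(plan)
```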
Automated DWH Restoration: Creating Efficient DACPAC and Restore Pipelines
Discover how we transformed the data warehouse (DWH) restoration process into a fully automated, efficient, and effortless routine. In this session, we'll dive into the architecture and implementation of our DACPAC and restore pipelines, leveraging the powerful combination of Octopus Deploy and PowerShell scripts with Commvault as our backup tool. We'll explore step-by-step how we orchestrated these technologies to seamlessly automate the entire process, allowing customers to restore a complete DWH in just a few hours for extensive testing without any hassle. Learn practical insights and best practices from our experience to replicate similar success in your own environments, enhancing your data recovery and deployment capabilities.
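The deploy step that follows the Commvault restore publishes a DACPAC with SqlPackage. As a sketch of that one pipeline stage (paths and server names are placeholders; the real pipeline wraps this in Octopus Deploy and PowerShell):

```python
# Build the SqlPackage publish command for one restore-and-deploy stage.
def sqlpackage_publish(dacpac: str, server: str, database: str) -> list[str]:
    return [
        "SqlPackage",
        "/Action:Publish",
        f"/SourceFile:{dacpac}",
        f"/TargetServerName:{server}",
        f"/TargetDatabaseName:{database}",
    ]

cmd = sqlpackage_publish("dwh.dacpac", "sql-test-01", "DWH")
print(" ".join(cmd))
```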
Unlocking the Power of Optimization in Azure Data Factory
Imagine deploying a project that seamlessly moves data from on-premises to Azure and Power BI—only to find your daily reports taking a staggering 18 hours to run in production.
In this session, I'll walk you through how I transformed this situation by rearchitecting the solution, significantly optimizing the performance. You’ll get an insider’s look at the different architectural designs and solutions available, and I'll share the good, the bad, and the ugly of working with Azure Data Factory.
Whether you’re looking to fine-tune your Data Factory pipelines or seeking ways to make your business more agile with faster data processing, this session is packed with practical insights. Discover how Azure Data Factory can help you achieve more, with less complexity and greater speed. Join me to learn how to turn potential roadblocks into streamlined, high-performance data solutions.
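A back-of-the-envelope model shows the kind of win rearchitecting can give: a ForEach activity copying tables one at a time versus in parallel batches. The numbers below are illustrative, not the figures from the session:

```python
import math

def pipeline_hours(tables: int, minutes_per_table: float, batch_size: int) -> float:
    """Estimate total runtime when tables are copied in waves of batch_size."""
    waves = math.ceil(tables / batch_size)
    return waves * minutes_per_table / 60

sequential = pipeline_hours(360, 3, 1)    # one table at a time
parallel = pipeline_hours(360, 3, 20)     # e.g. batch count 20 in the ForEach
print(sequential, parallel)               # hours: sequential vs parallel
```

The model ignores real-world limits (integration runtime capacity, source throttling), which is why the session spends its time on the architectural trade-offs rather than the arithmetic.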
Chat With Azure SQL
In the evolving world of data, the ability to interact with your datasets and databases using natural language is a game-changer. This session will introduce you to the power of Azure OpenAI, Azure SQL, and Microsoft Fabric, showing you how to break down barriers that have traditionally required knowledge of T-SQL or other coding languages. By leveraging Azure OpenAI and Azure AI Search, you’ll learn how to ask questions of your data—like forecasting future trends—without writing a single line of code. We'll explore how to connect various databases and datasets, implement data cleansing, and ensure data quality through advanced prompt engineering. Whether you're a data professional or a business leader, this session will revolutionize the way you think about and work with data. Don't miss out on discovering the future of talking with your data.
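A key building block is grounding the model's answer in your actual schema. As a hedged sketch of that prompt-engineering step (the schema text and question are placeholders, and the Azure OpenAI call itself is omitted):

```python
# Hypothetical schema; in practice this is generated from the live database.
SCHEMA = "CREATE TABLE dbo.Sales (SaleDate date, Region nvarchar(50), Amount decimal(18,2));"

def build_prompt(question: str) -> str:
    """Assemble a schema-grounded prompt for natural-language-to-T-SQL."""
    return (
        "You translate questions into T-SQL for the schema below.\n"
        f"Schema:\n{SCHEMA}\n"
        f"Question: {question}\n"
        "Answer with a single SELECT statement only."
    )

print(build_prompt("What was total sales per region last month?"))
```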
Danny and the Data Factory
In this session, which I presented at Azure Saturday NL, I show a full working data solution. Azure Functions, called from Data Factory every two minutes, move data into Data Lake Storage; Data Factory then loads the data into Azure SQL, where I transform it and use CDC to move the deltas across tables from Data Factory.
Once complete, the data is updated and presented in Power BI. The session is 50 minutes, in which I explain what Data Factory is from the beginning and show the power of moving data, explaining along the way.
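The delta movement can be sketched with a watermark: each run copies only rows changed since the last successful run. Timestamps and rows below are illustrative; in the session this logic lives in Data Factory with CDC, not in Python:

```python
# Keep only rows modified after the last recorded watermark.
def delta_since(rows: list[dict], watermark: str) -> list[dict]:
    return [r for r in rows if r["modified"] > watermark]

rows = [
    {"id": 1, "modified": "2024-01-01T10:00"},
    {"id": 2, "modified": "2024-01-01T10:02"},
    {"id": 3, "modified": "2024-01-01T10:04"},
]
changed = delta_since(rows, "2024-01-01T10:02")
print([r["id"] for r in changed])   # only rows after the watermark
```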
WTF (What The Fabric)
Microsoft Fabric? Something you may have heard of? Over the years I have used Azure Data Factory and Synapse, publishing data to Power BI. Now a game changer called Fabric is here! So, What the Fabric is happening? ;)
I hope to show you use cases and real-world scenarios where it can be utilized: a setup using PaaS solutions alongside Fabric as the SaaS solution providing the same concept.
We'll utilize Data Factory, Synapse, and Power BI within Fabric, along with Data Activator.
How and when should you migrate to Fabric? Big bang? In segments? And what do you need to consider?
The target audience ranges from data engineers to sales. I will walk through the thought process behind Fabric and which audience it really captures. Fabric is one product, and one product cannot be one size fits all.
Talk to Your Data: The Future of Data Interaction with AI
In the evolving world of data, the ability to interact with your datasets and databases using natural language is a game-changer. This session will introduce you to the power of Azure OpenAI, Azure SQL, and Microsoft Fabric, showing you how to break down barriers that have traditionally required knowledge of T-SQL or other coding languages. By leveraging Azure OpenAI and Azure AI Search, you’ll learn how to ask questions of your data—like forecasting future trends—without writing a single line of code. We'll explore how to connect various databases and datasets, implement data cleansing, and ensure data quality through advanced prompt engineering. Whether you're a data professional or a business leader, this session will revolutionize the way you think about and work with data. Don't miss out on discovering the future of talking with your data.
Mastering Data Migration: Essential Standards and Protocols for Any Scale
With years of experience managing successful data migrations, I invite you to explore the essential standards and protocols that ensure smooth transitions, whether dealing with small databases or vast systems. This session outlines best practices for planning, executing, and validating migrations to maintain data integrity and security with minimal downtime. Drawing from my extensive track record of successful migrations, I’ll share insights into overcoming common challenges and highlight strategies to tailor your approach for projects of any size. Equip yourself with the expertise needed for flawless migrations every time.
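One validation protocol worth sketching: after the move, compare a deterministic digest of source and target rows instead of eyeballing samples. The rows below are illustrative; real checks also cover row counts, schema, and permissions:

```python
import hashlib

def table_digest(rows: list[tuple]) -> str:
    """Order-independent digest of a table's rows for source/target comparison."""
    h = hashlib.sha256()
    for row in sorted(rows):            # sort so row order doesn't matter
        h.update(repr(row).encode())
    return h.hexdigest()

source = [(1, "a"), (2, "b")]
target = [(2, "b"), (1, "a")]
print(table_digest(source) == table_digest(target))   # same data, same digest
```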