Topics: Azure Data Platform, Azure SQL Server, Microsoft Azure, Microsoft SQL Server, High Availability, Performance Tuning, Azure Synapse, Azure Data & AI, Azure Data Factory, Azure Arc Data Services
Boston, Massachusetts, United States
Taiob Ali, Microsoft Data Platform MVP, is an accomplished technical leader with a proven record of success. For the last 15 years, he has worked with the Microsoft Data Platform and MongoDB, both on-premises and in the cloud. His experience spans all three major business sectors: finance, e-commerce, and healthcare.
Taiob currently works at GMO LLC as Database Solutions Manager, focusing on cloud migration, automation, and improving and streamlining operational workflows. He is a regular speaker at local and virtual chapters, Data Saturdays, and Azure conferences. He is a board member of the New England SQL Server User Group, founder of the Database Professionals Virtual Meetup group, and an organizer of Boston Data Saturday.
Many tools are available to migrate your on-premises database to Azure SQL Database. Are you familiar with all of those tools, and how do you choose the best one for your situation? How do you analyze and identify which objects are not compatible with a migration to Azure? The answer depends (of course) on the type, size, and complexity of the database you will be relocating.
This session will explore considerations before migration, appropriate targets, the migration tools available, and the pros and cons of each tool. I will demo four different tools that you can use to analyze and migrate your on-premises SQL Server database to Azure SQL.
At the end of this session, you will be aware of the various techniques available to analyze and migrate a SQL Server database to Azure and be able to choose the best-fitting one for your database.
Once you enable event logging (auditing and diagnostics) for an Azure SQL database, how do you consume the data? How do you find anomalies to alert on, establish a baseline, and look at trends?
In this 100% demo session, I will show you Kqlmagic, the KQL kernel, and the recently released Azure Monitor Logs extension for Azure Data Studio to consume event logs of an Azure SQL database.
An 8 KB page is the fundamental unit of data storage in SQL Server. For performance reasons, SQL Server performs every data modification operation in memory (the buffer pool) and does not write it back to disk immediately.
This is where checkpoint comes into play. The Database Engine periodically issues a checkpoint on each database. A checkpoint writes the current in-memory modified pages (known as dirty pages) and transaction log information from memory to disk. It also records this information in the transaction log.
This session will explain why you should care about the checkpoint process and the different types of checkpoints SQL Server performs. I will show you exactly what happens during a checkpoint, how you can influence the checkpoint interval, and the changes made to checkpoint settings in SQL Server 2014 and SQL Server 2016 and later.
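A minimal T-SQL sketch of the settings involved (the database name is a placeholder):

```sql
-- Indirect checkpoints (the default for databases created on SQL Server
-- 2016 and later) are configured per database via TARGET_RECOVERY_TIME:
ALTER DATABASE [YourDatabase] SET TARGET_RECOVERY_TIME = 60 SECONDS;

-- Classic automatic checkpoints follow the instance-wide
-- 'recovery interval' option (in minutes; 0 = the default behavior):
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'recovery interval', 0;
RECONFIGURE;

-- You can also issue a manual checkpoint in the current database:
CHECKPOINT;
```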
Extended Events offers a rich filtering mechanism, called predicates, that allows you to reduce the number of events you collect by specifying criteria that are applied during event collection. Because predicates are evaluated early in the event-firing sequence, the engine can reduce the performance impact of collecting events by stopping event evaluation as soon as the criteria are not met. You can specify predicates both on event fields and on a special object called a predicate source. By applying predicates to a predicate source, you can customize your collection and answer critical questions very quickly.
In this demo session, I will show you how to collect only every nth event. You can also use this trick to stop a collection automatically when you only need to know whether an event is occurring at all.
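As an illustration of the trick described above, here is a hedged sketch (the session name, event, and file target are made up for the example) that applies the divides_by_uint64 comparator to the package0.counter predicate source to keep only every fifth occurrence of an event:

```sql
CREATE EVENT SESSION [EveryFifthStatement] ON SERVER
ADD EVENT sqlserver.sql_statement_completed
    -- Keep the event only when the running counter is divisible by 5.
    (WHERE (package0.divides_by_uint64(package0.counter, 5)))
ADD TARGET package0.event_file (SET filename = N'EveryFifthStatement.xel');

ALTER EVENT SESSION [EveryFifthStatement] ON SERVER STATE = START;
```

A predicate such as package0.counter <= 1 can similarly capture just the first occurrence, so the session tells you whether the event fires at all without collecting every instance.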
As database professionals, we all know how to manage SQL Server Agent jobs using SQL Server Management Studio (SSMS), T-SQL, and PowerShell. With Azure Data Studio and the SQL Server Agent extension, we have new tools to create, maintain, and manage SQL Agent jobs using notebooks. These are called Notebook Jobs. Once you successfully execute a Notebook Job, manually or via a schedule, each run's results are saved into a separate notebook, which you can easily share with others.
In this 100% demo session, I will walk you through every step of Notebook Jobs.
The SQL Server IaaS Agent extension (SqlIaasExtension) runs on SQL Server on Azure Virtual Machines (VMs) to automate management and administration tasks.
This session will show you how to integrate SQL Server on Azure VMs with the Azure portal using one of the three management modes. You will learn how the three modes differ from each other, which feature benefits you get from each mode, and how compliance and license management can be simplified using SqlIaasExtension. At the end of this session, you will be familiar with the tools to automate management of SQL Server running on Azure VMs.
During query optimization, SQL Server uses a phase called cardinality estimation (CE). This process uses statistics to estimate how many rows flow from one query plan iterator to the next. Knowing how the CE generates these numbers will enable you to write better T-SQL code and, in turn, influence the type of physical operations during query execution. Based on those estimated rows, the query processor decides how to access an object, which physical join type to use, and how to sort the data. Do you know how the CE generates these numbers? If your query has only one predicate, the query optimizer will use the histogram to estimate how many rows will qualify. What happens when you have multiple predicates, range predicates, variable values that are "NOT KNOWN" to the optimizer, or predicate values increasing in ascending order? Do you know what will happen if your predicate uses a value that is outside the histogram range?
In this session, I will show you how the cardinality estimator arrives at its numbers in each of these scenarios. You will walk out of this session with a clear understanding of how the CE generates its estimates and ready to tackle those nasty, hard-to-solve query plans.
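To experiment with the multiple-predicate scenario yourself, you can run the same query under both estimator models; the table and predicate values below are placeholders, while the two hints are the documented USE HINT names:

```sql
-- Legacy CE (pre-2014 model): assumes predicates are independent,
-- so combined selectivity = s1 * s2.
SELECT *
FROM Sales.SalesOrderHeader
WHERE CustomerID = 11000
  AND TerritoryID = 4
OPTION (USE HINT('FORCE_LEGACY_CARDINALITY_ESTIMATION'));

-- New CE (2014+ model): uses exponential backoff, roughly
-- s1 * SQRT(s2) * SQRT(SQRT(s3)) ..., most selective predicate first.
SELECT *
FROM Sales.SalesOrderHeader
WHERE CustomerID = 11000
  AND TerritoryID = 4
OPTION (USE HINT('FORCE_DEFAULT_CARDINALITY_ESTIMATION'));
```

Comparing the estimated row counts in the two actual execution plans shows concretely how each model combines the predicates.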
Is there a way to fix query performance issues without any code changes? Making application changes can be an expensive undertaking, so developers and DBAs want the query processor to adapt to their workload needs rather than relying on options and trace flags to gain performance. Adaptation is the idea behind Intelligent Query Processing in the newer versions of SQL Server. In this demo-heavy presentation, I will walk you through seven different Intelligent Query Processing features introduced in the last two versions of SQL Server. We will look at both the pros and cons of using these features and how to control them if you are not ready to use them yet.
By attending this session, you will learn the new capabilities of Intelligent Query Processing. You will equip yourself with powerful tools to convince your peers to change the database compatibility level to 150 for Azure SQL Databases or to upgrade to SQL Server 2019 for on-premises databases.
Do you want to evaluate your environments against the 232 currently suggested best practices from the SQL Server team? These best practices apply to all SQL Server instances, from your data center to the cloud.
In this all-demo session (yes, no slides), I will show you both the "what" and the "how" of the SQL Assessment API. You will learn how to detect whether a setting has changed since your initial check, and how to customize your checks and create your own rules to fit your environment. The demo will use the SQL Assessment API, PowerShell cmdlets, and the new 'SQL Assessment' extension in Azure Data Studio to export assessment results to a SQL table and create an HTML report.
At the end of this session, you will be equipped with the toolset to set up and run your own assessment, create customized rules, and determine your compliance score against suggested best practices.
SQL Server Management Studio (SSMS) is now updated each month with the latest version of the SQL client tools, enabling rapid enhancements to execution plans.
Did you know that you can now see how many pages were spilled to disk by a sort, as well as the memory statistics for the sort operator? Are you aware that you can see the top 10 waits for a single execution and which trace flags were active during compilation, including their scope? Have you ever analyzed a big plan and wished you could search for a table, index, or column name without opening the XML plan? When you see an index being used, do you know for sure which statistics the Query Optimizer used to estimate the number of rows?
In this demo-intensive session, I will show you how to find and use the new features introduced in SSMS and Showplan. You will walk out of this session equipped to make query performance troubleshooting easier and ready to tackle those nasty, hard-to-solve query plans.
Kusto is a service for storing and running interactive analytics over big data. It was designed from scratch to be a "big data" repository for Azure that is easy to query using the Kusto Query Language (KQL). As we progress in our migration to the cloud, we learn new ways to monitor and alert on resources and services. Microsoft has consolidated many services under the umbrella of Azure Monitor. Whether you are detecting and diagnosing issues across applications and dependencies, correlating infrastructure issues with Azure Monitor for VMs, or drilling into your SQL database monitoring data with Log Analytics, you will be using Azure Monitor. Azure Monitor uses a version of the KQL used by Azure Data Explorer that is suitable for simple log queries but also includes advanced functionality such as aggregations, joins, and intelligent analytics. Going forward, KQL should be your primary tool for querying Azure Monitor logs.
This 95% demo session will show you some 'getting started' tips and a few sophisticated queries using KQL, including a live demo of generating an alert. I will dive into Kqlmagic, which brings the benefits of notebooks, data analysis, and rich Python capabilities together against an Azure Log Analytics workspace. At the end of this session, beginners will have solid KQL knowledge that they can build on by themselves; others will also learn many advanced operators, including machine learning ones. Come and learn about the future of monitoring and investigating Azure services.
You migrated your on-premises SQL database to the cloud, taking advantage of the PaaS offering of Azure SQL Database. You heard the promise of spinning up databases on demand, scaling up resources during peak demand, and scaling down when not in use. You also want to ensure you perform integrity checks, index defragmentation, and statistics updates when necessary. There is no SQL Agent, so how do you automate these jobs? Do you have time to do this manually each time? No. Different options are available to automate these long-running, manual, error-prone, and frequently repeated tasks to increase efficiency.
In this demo-intensive session, I will show you different options for automating these tasks. Some of these solutions use on-premises infrastructure, while others use Azure services conveniently encapsulated within the familiar Azure portal experience.
At the end of this session, you will have a solid understanding of how to automate SQL Server Maintenance tasks, including replacing SQL Agent functionality with multiple options.
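As a hedged sketch of the three maintenance tasks mentioned above (object names are illustrative, and in Azure SQL Database these statements would be scheduled through whichever automation option you choose):

```sql
-- 1. Integrity check of the current database.
DBCC CHECKDB WITH NO_INFOMSGS;

-- 2. Index defragmentation (reorganize a fragmented index).
ALTER INDEX [IX_Orders_CustomerID] ON dbo.Orders REORGANIZE;

-- 3. Statistics update.
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;
```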
Azure SQL Database is a relational database-as-a-service (DBaaS) based on the latest version of the Microsoft SQL Server Database Engine. You can build data-driven applications and websites in the programming language of your choice without needing to manage infrastructure.
While considering Azure SQL Database, two of the most critical questions you need to answer are: what are my options for business continuity, and how do I implement them?
In this demo intensive session, I will discuss five specific options and show you how to implement those. You will walk out of this session equipped with knowledge and scripts to implement appropriate High Availability and Disaster Recovery solutions for your Azure SQL Database.
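One such option, active geo-replication, can be configured entirely in T-SQL; here is a hedged sketch with placeholder server and database names:

```sql
-- Run in the master database of the server hosting the primary:
-- creates a readable secondary on another logical server.
ALTER DATABASE [YourDatabase]
    ADD SECONDARY ON SERVER [your-dr-server]
    WITH (ALLOW_CONNECTIONS = ALL);

-- Planned failover: run in the master database of the secondary server.
-- ALTER DATABASE [YourDatabase] FAILOVER;
```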
Notebooks in Azure Data Studio are growing more popular as Microsoft releases new kernels. Parameterizing a notebook allows you to run the same notebook with different sets of values, which makes automation easier.
This 100% demo session will show you three different ways to parameterize notebooks in Azure Data Studio: the Papermill Python library, URI parameterization, and Kqlmagic.