
Taiob Ali
Microsoft Data Platform MVP | Global Data Solutions Leader | Cloud & AI Advocate
Boston, Massachusetts, United States
I'm Taiob Ali, a Microsoft Data Platform MVP with over 19 years of experience architecting and managing data solutions across the finance, e-commerce, and healthcare sectors. Currently, I serve as the Database Solutions Manager at GMO LLC, leading a global team across three continents to drive cloud migrations, automation, and operational excellence.
My expertise spans the Microsoft Data Platform, MongoDB, and emerging technologies like Azure AI and Python for data-driven decision-making. I'm passionate about sharing knowledge and have presented at over 100 events worldwide, including SQL Saturdays, Data Saturdays, and international conferences.
As a community leader, I founded the Database Professionals Virtual Meetup Group, served on the New England SQL Server User Group board, and organized Boston SQL Saturday. My contributions to Microsoft Learn have been recognized in the Contributor Stories.
SQL Database in Microsoft Fabric: Redefining Modern Data Workflows
With the introduction of Microsoft Fabric, a new flavor of SQL Database has emerged—built to power unified data experiences across analytics, AI, and business intelligence. But what sets it apart from traditional SQL Server or Azure SQL Database?
In this session, we’ll unpack the rationale behind this new SQL engine and explore its advantages for data professionals. You’ll learn how SQL in Fabric simplifies data integration, enhances collaboration through native support for Notebooks and Power BI, and extends analytics possibilities with built-in support for Azure OpenAI and vector-based queries.
Expect a practical walkthrough: from initial setup and data ingestion to generating insights with Power BI and monitoring performance. We'll also explore how Fabric’s SQL engine fits into the broader data architecture—and whether DBAs and data engineers need to “retrain” or simply “retool.”
PostgreSQL vs. SQL Server: Security Model Differences
Security is paramount in database management. If you are a SQL Server expert looking to learn PostgreSQL, you'll need to understand how PostgreSQL's security model differs from SQL Server's. Aimed at database administrators and developers, this talk will compare the security models of the two database systems and highlight the key differences in how they handle user authentication, roles, and permissions.
For example, did you know that:
- SQL Server distinguishes between logins and users, whereas PostgreSQL uses a unified role-based system for authentication and authorization.
- SQL Server offers predefined server and database roles, such as sysadmin, which provides a range of out-of-the-box permissions. By comparison, PostgreSQL includes predefined roles like pg_read_all_data, designed to simplify standard permission sets.
- SQL Server allows the creation of custom roles with flexible permission assignments. PostgreSQL roles can inherit permissions from other roles and support complex role hierarchies.
Understanding these and the other differences covered during the session will enable you to implement security best practices effectively in either environment.
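To give a flavor of the comparison, here is a minimal, illustrative sketch (not the session's demo code) of creating a read-only reporting principal on both engines. It assumes Python with the pyodbc and psycopg2 drivers; server names, credentials, and role names are placeholders.

```python
# Hypothetical sketch: create a read-only reporting principal on both engines.
# Connection strings, names, and passwords are placeholders.
import pyodbc    # SQL Server ODBC driver wrapper (assumed installed)
import psycopg2  # PostgreSQL driver (assumed installed)

# SQL Server: a login (server-level principal) plus a user (database-level principal)
mssql = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=myserver;"
    "DATABASE=SalesDb;UID=admin_user;PWD=<password>;TrustServerCertificate=yes",
    autocommit=True,
)
ms = mssql.cursor()
ms.execute("CREATE LOGIN report_login WITH PASSWORD = '<StrongPassword1!>';")  # server level
ms.execute("CREATE USER report_user FOR LOGIN report_login;")                  # database level
ms.execute("ALTER ROLE db_datareader ADD MEMBER report_user;")                 # fixed database role

# PostgreSQL: one unified role system covers both login and permissions
pg = psycopg2.connect("host=myhost dbname=salesdb user=admin_user password=<password>")
pg.autocommit = True
pgc = pg.cursor()
pgc.execute("CREATE ROLE report_role WITH LOGIN PASSWORD '<StrongPassword1!>';")  # role that can log in
pgc.execute("GRANT pg_read_all_data TO report_role;")  # predefined role, PostgreSQL 14+
```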
Mastering Kusto Query Language (KQL): Your Gateway to Azure Monitoring and Insights
As organizations move workloads to Azure, visibility into resource performance, security, and operations becomes essential. Azure Monitor is Microsoft’s centralized platform for observability, bringing together telemetry across applications, infrastructure, and services. At the core of this platform is Kusto Query Language (KQL)—a fast, expressive language purpose-built for exploring log data and enabling deep analytics.
This session is your hands-on introduction to KQL, designed for cloud professionals who want to gain actionable insights from Azure Monitor Logs. Through live demos and practical examples, we’ll walk through the core concepts and real-world applications of KQL, including:
Understanding KQL syntax and how it compares to T-SQL
Querying and analyzing log data using Azure Log Analytics
Efficient techniques for filtering, sorting, and summarizing data
Building alerts and identifying anomalies with KQL-based queries
Using joins, time-series functions, and advanced features for deeper analysis
By the end of the session, you'll be equipped to confidently write and use KQL queries to enhance monitoring, security, and operational decision-making in Azure environments.
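To give a taste of where the demos land, here is a small, illustrative sketch (not the session's demo code) that runs a KQL query against a Log Analytics workspace from Python. The azure-monitor-query and azure-identity packages, the workspace ID, and the AzureActivity table are assumptions that depend on your environment.

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# KQL: filter, summarize, and rank - the same pipeline shape the session walks through
kql = """
AzureActivity
| where TimeGenerated > ago(1d)
| summarize Operations = count() by OperationNameValue
| top 5 by Operations
"""

result = client.query_workspace(
    workspace_id="<log-analytics-workspace-guid>",  # placeholder
    query=kql,
    timespan=timedelta(days=1),
)
for table in result.tables:  # successful queries expose result tables
    for row in table.rows:
        print(row)
```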
🎓 Prerequisites:
Basic understanding of SQL or T-SQL
General familiarity with Azure services
360-degree Overview of Backup and Restore
If you are the database steward, your most critical task is to guarantee that all committed transactions are always recoverable after a disaster, within acceptable limits for data loss and downtime.
Achieving this can be as simple as taking a full backup, or as complex as filegroup backups, depending on the size and criticality of your application data.
Whatever your situation is, being well-prepared and practicing with your tools, scripts, and strategy will ensure you can respond quickly and efficiently when a disaster happens.
In this session, I will teach you the basic types of backups and how to perform backups and restores using SSMS and T-SQL. Then we will move to advanced techniques, discussing file and filegroup backups, partial database restores, and the T-SQL snapshot backups introduced with SQL Server 2022.
At the end of the session, you'll be able to create a solid backup and restore strategy aligned with the service level agreement you have with your business counterparts.
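As a flavor of the scripting covered, here is a minimal, illustrative sketch (not the session's demo code) that takes a full backup and verifies it. It assumes Python with pyodbc, a placeholder connection string, and an example backup path.

```python
import pyodbc

# autocommit is required: BACKUP/RESTORE cannot run inside a user transaction
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=myserver;"
    "DATABASE=master;UID=backup_operator;PWD=<password>;TrustServerCertificate=yes",
    autocommit=True,
)
cur = conn.cursor()

# Full backup with checksum verification (database name and path are placeholders)
cur.execute(
    "BACKUP DATABASE [SalesDb] TO DISK = N'/var/opt/mssql/backup/SalesDb_full.bak' "
    "WITH INIT, CHECKSUM, STATS = 10;"
)
while cur.nextset():  # consume informational result sets until the backup finishes
    pass

# Verify the backup is restorable without actually restoring it
cur.execute(
    "RESTORE VERIFYONLY FROM DISK = N'/var/opt/mssql/backup/SalesDb_full.bak' WITH CHECKSUM;"
)
while cur.nextset():
    pass
```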
Leveraging Azure AI and Python for Data-Driven Decision Making
In this technical talk, we will explore how to harness the power of Azure AI, Azure AI Studio, Azure Search Services, and large language models to extract valuable decision-making data from the Azure SQL Database.
We will begin by discussing Azure AI and its capabilities. Starting with a clean slate, we will build a solution in Azure AI Studio, using its user-friendly interface to chat with a SQL database and support data-driven decisions without writing any code. The solution will also draw on Azure Search Services, highlighting how it can be used to efficiently index and query data.
The second part of the presentation will focus on using large language models and Python notebooks to extract and analyze data from Azure SQL Database. Attendees will learn how to set up their environment, connect to the database, and implement AI-driven solutions that let them talk to the database.
By the end of the session, participants will have a solid foundation in using Azure AI and Python for data-driven decision-making, empowering them to leverage these tools in their projects.
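To illustrate the second part of the talk, here is a hedged sketch of the "talk to the database" pattern, assuming Python with pyodbc and the openai package. The endpoint, deployment name, API version, connection string, and sample question are placeholders rather than the session's exact solution.

```python
import pyodbc
from openai import AzureOpenAI  # openai>=1.0 with Azure support (assumed installed)

# 1) Read table/column metadata from the Azure SQL Database (placeholder connection string)
sql_conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=myserver.database.windows.net;"
    "DATABASE=SalesDb;UID=reader_user;PWD=<password>;Encrypt=yes"
)
cur = sql_conn.cursor()
cur.execute(
    "SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, DATA_TYPE "
    "FROM INFORMATION_SCHEMA.COLUMNS ORDER BY TABLE_SCHEMA, TABLE_NAME, ORDINAL_POSITION;"
)
schema_text = "\n".join(f"{s}.{t}.{c} ({d})" for s, t, c, d in cur.fetchall())

# 2) Ask an Azure OpenAI chat deployment to translate a question into T-SQL
client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder
    api_key="<your-key>",
    api_version="2024-02-01",  # assumed API version
)
question = "Which five customers generated the most revenue last quarter?"
response = client.chat.completions.create(
    model="<your-chat-deployment>",  # the deployment name, not the base model name
    messages=[
        {"role": "system",
         "content": "You translate questions into read-only T-SQL for this schema:\n" + schema_text},
        {"role": "user", "content": question},
    ],
)
generated_sql = response.choices[0].message.content
print(generated_sql)  # review before executing anything the model produced
```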
(Lightning talk) Break five common myths about SQL Server backup
In this session, I will discuss five common myths about SQL Server backups and demo each one to show you the correct answer (a small inspection script follows the list).
1. Do full and differential backups break the log chain?
2. Are differential backups incremental?
3. Which backups are allowed on system databases?
4. Is a transaction log backup necessary during a full backup?
5. Does backup use the buffer pool to read data pages?
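As promised above, here is a small, illustrative inspection script (not the session's demo code) that reads the backup history in msdb so you can check the log chain and differential base yourself. It assumes Python with pyodbc and a placeholder database name.

```python
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=myserver;"
    "DATABASE=msdb;Trusted_Connection=yes;TrustServerCertificate=yes"
)
cur = conn.cursor()

# type: D = full, I = differential, L = transaction log
cur.execute(
    """
    SELECT type, backup_start_date, first_lsn, last_lsn, database_backup_lsn
    FROM msdb.dbo.backupset
    WHERE database_name = ?
    ORDER BY backup_start_date;
    """,
    ("SalesDb",),  # placeholder database name
)
for btype, started, first_lsn, last_lsn, base_lsn in cur.fetchall():
    # For log backups, first_lsn should match the previous backup's last_lsn (unbroken chain);
    # database_backup_lsn shows which full backup a differential or log backup is tied to.
    print(btype, started, first_lsn, last_lsn, base_lsn)
```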
(Lightning Talk) Analyzing Azure Monitor Log data in Azure Data Studio
Once you enable event logging (auditing and diagnostics) for an Azure SQL database, how do you consume the data? How do you find anomalies to alert on, establish a baseline, and look at trends?
In this 100% demo session, I will show you Kqlmagic, the KQL kernel, and the recently released Azure Monitor Logs extension for Azure Data Studio to consume the event logs of an Azure SQL database.
Think like the Cardinality Estimator
SQL Server uses a phase during query optimization called cardinality estimation (CE). This process uses statistics to estimate how many rows flow from one query plan iterator to the next. Knowing how the CE generates these numbers will enable you to write better T-SQL code and, in turn, influence the physical operations chosen during query execution. Based on those estimates, the query processor decides how to access an object, which physical join to use, and how to sort the data. Do you know how the CE arrives at these numbers? If your query has only one predicate, the query optimizer uses the histogram to estimate how many rows will qualify. But what happens when you have multiple predicates, range predicates, or variable values that are “NOT KNOWN” to the optimizer, or when your predicate values keep increasing in ascending order? Do you know what happens if your predicate uses a value that falls outside the histogram range?
In this session, I will show you how the cardinality estimator arrives at its estimates in all of these scenarios. You will walk out of this session with a clear understanding of how the CE generates its numbers, ready to tackle those nasty, hard-to-solve query plans.
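To give a flavor of the demos, here is a minimal, illustrative sketch (not the session's demo code) that pulls the statistics histogram the CE reads for a single-predicate estimate. It assumes Python with pyodbc and uses AdventureWorks object names as placeholders.

```python
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=myserver;"
    "DATABASE=AdventureWorks2022;Trusted_Connection=yes;TrustServerCertificate=yes",
    autocommit=True,
)
cur = conn.cursor()

# The histogram behind a single-predicate estimate (table and statistic names are examples)
cur.execute(
    "DBCC SHOW_STATISTICS ('Sales.SalesOrderDetail', 'IX_SalesOrderDetail_ProductID') "
    "WITH HISTOGRAM;"
)
print("RANGE_HI_KEY  EQ_ROWS  RANGE_ROWS  AVG_RANGE_ROWS")
for hi_key, range_rows, eq_rows, distinct_range_rows, avg_range_rows in cur.fetchall():
    # EQ_ROWS: estimate for WHERE ProductID = RANGE_HI_KEY
    # AVG_RANGE_ROWS: estimate for an equality predicate that falls inside the step
    print(hi_key, eq_rows, range_rows, avg_range_rows)
```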
What the heck is a checkpoint, and why should I care?
In SQL Server, a checkpoint is an internal process that writes dirty pages and transaction log records from memory to disk and marks a point in the transaction log. An 8K page is the fundamental data storage unit in SQL Server.
SQL Server performs every data modification operation in memory (buffer pool) for performance reasons and does not immediately write it back to disk.
This is where checkpoints come into play. There are four types of checkpoints: automatic, indirect, manual, and internal. The Database Engine periodically issues a checkpoint on each database, based on the current settings, to help reduce the time needed to recover the database after an unexpected shutdown or system failure.
This session will explain why you should care about the checkpoint process and the different checkpoints SQL Server performs. I will show you exactly what happens during a checkpoint, how you can influence the interval between checkpoints, and the changes made to checkpoint settings in SQL Server 2014 and SQL Server 2016 and later.
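To give a flavor of the demos, here is a minimal, illustrative sketch (not the session's demo code) that counts dirty pages before and after a manual CHECKPOINT and sets an indirect-checkpoint target. It assumes Python with pyodbc, VIEW SERVER STATE permission, and placeholder names.

```python
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=myserver;"
    "DATABASE=SalesDb;Trusted_Connection=yes;TrustServerCertificate=yes",
    autocommit=True,  # ALTER DATABASE and CHECKPOINT should not run inside a user transaction
)
cur = conn.cursor()

def dirty_page_count(cursor):
    """Count modified (dirty) 8 KB pages this database currently holds in the buffer pool."""
    cursor.execute(
        "SELECT COUNT(*) FROM sys.dm_os_buffer_descriptors "
        "WHERE database_id = DB_ID() AND is_modified = 1;"
    )
    return cursor.fetchone()[0]

print("Dirty pages before:", dirty_page_count(cur))
cur.execute("CHECKPOINT;")  # manual checkpoint: flush dirty pages for this database to disk
print("Dirty pages after: ", dirty_page_count(cur))

# Indirect checkpoints: ask the engine to keep the estimated recovery time near 60 seconds
cur.execute("ALTER DATABASE [SalesDb] SET TARGET_RECOVERY_TIME = 60 SECONDS;")
```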
Considerations for migrating SQL databases to Azure
Many tools are available to migrate your on-premises database to Azure SQL Database. Are you familiar with all of those tools, and how do you choose the best one? How do you analyze and identify which objects are incompatible with a move to Azure? The answer depends (of course) on the type, size, and complexity of the database you will be relocating.
This session will explore considerations before migration, appropriate targets, migration tools available, and the pros and cons of each tool. I will demo four tools you can use to analyze/migrate your on-premises SQL Server Database to Azure SQL.
At the end of this session, you will know the various techniques available to analyze and migrate a SQL database to Azure and be able to choose the one that best fits your database.
Azure SQL Database - Where is my SQL Agent?
You migrate your on-premises SQL Database to the cloud, taking advantage of the PaaS offering of Azure SQL Database because SQL Managed Instance seems a bit too much for what you require. You heard the promise of spinning up databases on-demand, scaling up resources during high peaks, and scaling down when unused. You also want to ensure you perform integrity checks, index defragmentation, and statistics updates when necessary. But you opted for an offering with no SQL Agent, so how do you automate your jobs?
Do you have time to do this manually each time? No. Different options are available to automate these long-running, manual, error-prone, and frequently repeated tasks to increase efficiency.
In this session, which is full of demos, I will show you six ways to automate these tasks. Some of these solutions use your own infrastructure or Azure services that you can access from the Azure portal experience.
At the end of this session, you will have a good understanding of how to automate Azure SQL Database Maintenance tasks, including replacing SQL Agent functionality with multiple options.
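As one illustration of the kind of task you would automate, here is a minimal, hedged maintenance sketch that any of the scheduling options discussed (for example Azure Automation, Azure Functions, or Elastic Jobs) could run. It assumes Python with pyodbc; server, database, and credential names are placeholders and the fragmentation thresholds are illustrative.

```python
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=myserver.database.windows.net;"
    "DATABASE=SalesDb;UID=maintenance_user;PWD=<password>;Encrypt=yes",
    autocommit=True,
)
cur = conn.cursor()

# Find fragmented indexes in the current database (thresholds are illustrative)
cur.execute(
    """
    SELECT QUOTENAME(s.name) + '.' + QUOTENAME(o.name) AS table_name,
           QUOTENAME(i.name) AS index_name,
           ips.avg_fragmentation_in_percent
    FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
    JOIN sys.indexes AS i ON i.object_id = ips.object_id AND i.index_id = ips.index_id
    JOIN sys.objects AS o ON o.object_id = ips.object_id
    JOIN sys.schemas AS s ON s.schema_id = o.schema_id
    WHERE ips.avg_fragmentation_in_percent > 10 AND i.name IS NOT NULL;
    """
)
for table_name, index_name, frag in cur.fetchall():
    # Rebuild heavily fragmented indexes, reorganize the rest
    action = "REBUILD WITH (ONLINE = ON)" if frag > 30 else "REORGANIZE"
    cur.execute(f"ALTER INDEX {index_name} ON {table_name} {action};")

cur.execute("EXEC sp_updatestats;")  # refresh statistics after the index work
```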
Prerequisites:
Familiarity with Azure SQL Database (PaaS). Understand what SQL Server Agent does for on-premises SQL Servers.
Goals:
1. You will learn six options to automate maintenance tasks against Azure SQL Database (PaaS).
2. Once you familiarize yourself with the different options, you can compare them and decide which suits your environment.
The magnificent seven and beyond: Intelligent Query Processing in SQL Server
Can we enhance query performance without any code changes? Modifying applications can be an expensive endeavor or completely beyond your control. Therefore, developers and DBAs prefer that the query processor adapts to their workload requirements rather than relying on options and trace flags to improve performance. Adaptation is the foundational concept behind Intelligent Query Processing (IQP) in the latest versions of SQL Server. This demo-intensive presentation will explore the fifteen intelligent query processing features introduced in SQL Server 2022, 2019, and 2017. For each of these fifteen features, we will examine the issue it aims to resolve and the algorithm it uses to tackle the problem. We will evaluate the pros and cons of using these features. You will learn how to deploy them at various scopes tailored to your specific needs, such as server, database, session, or query levels. You will also be able to identify the features built on the Query Store.
Attending this session will allow you to learn about the new capabilities of intelligent query processing and gain powerful tools to persuade your peers to upgrade SQL Server and databases to the latest build, both on-premises and in the cloud.
Prerequisites:
A basic understanding of query processing, familiarity with estimated and actual execution plans, and knowledge of how to measure query execution performance are required.
Goals:
1. Explore the Intelligent Query Processing features introduced since SQL Server 2017.
2. Understand the issues each feature addresses and how it resolves them.
3. Learn how to turn these features on or off at the server, database, session, and query levels.
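To illustrate goal 3, here is a minimal, hedged sketch of toggling IQP-related behavior at the database and query scopes (the other scopes are covered in the session). It assumes Python with pyodbc; the database and table names are placeholders.

```python
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=myserver;"
    "DATABASE=SalesDb;Trusted_Connection=yes;TrustServerCertificate=yes",
    autocommit=True,  # ALTER DATABASE statements cannot run inside a user transaction
)
cur = conn.cursor()

# Database scope: the IQP feature families largely follow the compatibility level,
# and individual features can be toggled with database-scoped configurations.
cur.execute("ALTER DATABASE [SalesDb] SET COMPATIBILITY_LEVEL = 160;")  # SQL Server 2022 behavior
cur.execute("ALTER DATABASE SCOPED CONFIGURATION SET BATCH_MODE_ON_ROWSTORE = ON;")
cur.execute("ALTER DATABASE SCOPED CONFIGURATION SET DEFERRED_COMPILATION_TV = ON;")

# Query scope: opt a single statement out of one feature with a USE HINT
cur.execute(
    "SELECT COUNT(*) FROM Sales.Orders "  # placeholder table
    "OPTION (USE HINT ('DISABLE_BATCH_MODE_ADAPTIVE_JOINS'));"
)
print(cur.fetchone()[0])
```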
Lifting Your Data Skills to the Cloud
Ninety percent of enterprises utilize cloud services, 67% of enterprise infrastructure is now cloud-based, and 86% of businesses employ a multi-cloud strategy. What does that mean for data professionals who have worked with on-premises technology for years? Do we have to relearn everything from scratch? How can we leverage the knowledge and experience we have acquired over the years and apply it to the cloud?
If you are thinking about these questions, this session is for you. Many responsibilities have become shared between database administrators and the cloud provider. For example, in the PaaS model, the cloud provider will take a database backup, while the database administrators are responsible for setting up long-term retention.
We will discuss the critical areas of database administration (system provisioning, data migration, data security, performance tuning and monitoring, disaster recovery, high availability, backup and recovery, and cost management). I will show which existing skills still apply and which new ones you will need to learn, and I will cover the tools provided by cloud providers that you will need to learn and use.
This session will equip attendees with the necessary information to successfully administer databases in the cloud.
An attendee will walk away knowing three things:
1. How to translate on-premises expertise to the cloud.
2. For each critical area of database administration, the new skills required to be a successful "Cloud DBA".
3. The tools Cloud DBAs can leverage from the cloud provider (in this case, Microsoft Azure).
SQL Server Detective: Investigating Logs and Traces for Optimal Performance
As a DBA, you are a pivotal figure in the realm of Microsoft SQL Server. Your ability to monitor and analyze your server's performance and troubleshoot issues is essential, and it's what keeps the system running smoothly. One of the most effective ways to do this is by utilizing the built-in data collections, which record a wealth of information.
This session will delve into six collections (logs and traces) that can provide valuable insights into your server's configuration, health, and performance. By the end of this session, you will be equipped with practical knowledge that will empower you to handle any SQL Server issue confidently.
• Default trace
• SQL Error log
• SQL Agent Error log
• System_health
• AlwaysOn_health
• Telemetry xEvents
I will show you each one's default location, retention, and the content it captures. We will discuss how to access and interpret these logs and demonstrate how they can be used to identify and diagnose issues such as failed logins and deadlocks, the health of a Windows cluster and availability group, and schema changes. By effectively utilizing this built-in black box, you can ensure the smooth operation of your server and maintain the highest performance and reliability.
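To give a flavor of the demos, here is a minimal, illustrative sketch (not the session's demo code) that reads two of the six collections, the default trace and the system_health Extended Events session. It assumes Python with pyodbc and a placeholder connection string.

```python
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=myserver;"
    "DATABASE=master;Trusted_Connection=yes;TrustServerCertificate=yes"
)
cur = conn.cursor()

# 1) Default trace: find its file, then read recent events with their names
cur.execute("SELECT path FROM sys.traces WHERE is_default = 1;")
trace_path = cur.fetchone()[0]
cur.execute(
    """
    SELECT TOP (20) te.name AS event_name, t.DatabaseName, t.ObjectName, t.LoginName, t.StartTime
    FROM sys.fn_trace_gettable(?, DEFAULT) AS t
    JOIN sys.trace_events AS te ON te.trace_event_id = t.EventClass
    ORDER BY t.StartTime DESC;
    """,
    (trace_path,),
)
for row in cur.fetchall():
    print(row)

# 2) system_health Extended Events session: read the raw event XML from its .xel files
cur.execute(
    "SELECT CAST(event_data AS xml) "
    "FROM sys.fn_xe_file_target_read_file('system_health*.xel', NULL, NULL, NULL);"
)
print("system_health events captured:", len(cur.fetchall()))
```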
Prerequisite: Familiar with SQL Server administration
Goals:
1. Know the default location, retention, and content of the six built-in SQL Server logs/collections.
2. Know how information from these collections can be extracted for different troubleshooting scenarios and ensure reliable and highly available database systems.
(Lightning talk) go-sqlcmd: A CLI for SQL Server and Azure SQL
go-sqlcmd is a powerful open-source, cross-platform, command-line tool with no ODBC dependency. It maintains all the features of the traditional SQLCMD utility while introducing enhancements. For example, it allows you to create a local SQL Server instance with a sample database with one line of code.
The Go package (go-mssqldb driver) is designed to seamlessly interact with Microsoft SQL Server, Azure SQL Database, and Azure Synapse. It lets users enter Transact-SQL statements, system procedures, and script files at the command prompt. Whether you’re a seasoned database administrator or a developer, go-sqlcmd streamlines your SQL-related tasks, making it an essential addition to your toolkit.
Join me in this lightning talk to discover essential features and how go-sqlcmd simplifies SQL interactions and empowers your database workflows!
Prerequisite: None
Goals:
1. Learn about the essential functions of go-sqlcmd
2. How to install go-sqlcmd and query a SQL Server instance with it
3. How to manage context
Azure Database for PostgreSQL: 15 Essential Standards for Compliance and Security
Our team recently inherited multiple Azure Database for PostgreSQL servers and discovered that essential items such as Private Endpoints, Authentication Methods, Backup Retention, Diagnostics Settings, Compute Type, and High Availability were not implemented uniformly.
Implementing uniform standards for your PostgreSQL databases (even when hosted in Azure) helps ensure compliance with regulations such as GDPR, SOX, PCI DSS, and HIPAA. These regulations require institutions to maintain robust security measures and audit trails to protect sensitive data and comply with legal requirements, especially in industries like finance and healthcare. Failure to comply with industry regulations can lead to monetary fines, civil penalties, operational restrictions, and, most importantly, reputational damage.
In this session, you will receive a practical checklist of fifteen standards, an explanation of why each one matters, and actionable insights for implementing them across your company.
Prerequisite:
Familiarity with the features available with Azure Database for PostgreSQL.
Goals:
1. Standardize PostgreSQL Database Implementations in Azure.
2. Improve Compliance with Industry Regulations.
3. Enhance Security and Operational Efficiency.
Upcoming events:
- POSETTE: An Event for Postgres 2025
- SQL Saturday New York City 2025
- Cloud Data Driven User Group - 2025 Virtual Sessions
- Capital Area SQL Server User Group