Session
How To Tune A Multi-Terabyte Database For Optimum Performance
Are you responsible for one or more very large databases that are having performance issues or seem to be running slowly? Perhaps you’re having trouble completing index maintenance, or backups and restores are taking an unusually long time? In this session, we will explore practical strategies for tuning multi-terabyte SQL Server databases. We will work through common performance challenges, including slow queries, lengthy backup times, and inefficient storage configurations. Through real-world examples, performance metrics, and SQL scripts, we will demonstrate how to improve throughput, optimize file and index management, and implement compression and partitioning techniques.
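As a taste of the metric-driven approach the session describes, the short T-SQL sketch below reads per-file I/O statistics from the sys.dm_io_virtual_file_stats DMV to surface average read and write latency per database file. It is an illustrative example, not a script from the session itself; the ordering and any latency thresholds you apply to the results are your own judgment call.

-- Sketch: average read/write latency per database file since the last restart.
-- Values are cumulative, so compare snapshots over time for trending.
SELECT
    DB_NAME(vfs.database_id) AS database_name,
    mf.physical_name,
    vfs.num_of_reads,
    vfs.num_of_writes,
    CASE WHEN vfs.num_of_reads = 0 THEN 0
         ELSE vfs.io_stall_read_ms / vfs.num_of_reads END AS avg_read_latency_ms,
    CASE WHEN vfs.num_of_writes = 0 THEN 0
         ELSE vfs.io_stall_write_ms / vfs.num_of_writes END AS avg_write_latency_ms
FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
JOIN sys.master_files AS mf
    ON mf.database_id = vfs.database_id
   AND mf.file_id = vfs.file_id
ORDER BY avg_read_latency_ms DESC;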
Session Goals
Diagnose and resolve performance bottlenecks in large-scale SQL Server environments using tools like DiskSpd and CrystalDiskMark.
Design and implement optimized storage architectures, including filegroups, partitioning, and compression strategies (see the sketch after this list).
Measure and improve I/O throughput and latency through configuration tuning and infrastructure collaboration.
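To make the filegroup, partitioning, and compression goal concrete, here is a minimal T-SQL sketch assuming a hypothetical dbo.Orders table partitioned by order date across yearly filegroups (FG_2022 through FG_2025, which must already exist in the database). The object names, column list, and boundary dates are illustrative assumptions, not material from the session.

-- Sketch: yearly RANGE RIGHT partitioning plus PAGE compression.
-- All names (pf_OrderDate, ps_OrderDate, dbo.Orders, FG_*) are hypothetical.
CREATE PARTITION FUNCTION pf_OrderDate (date)
    AS RANGE RIGHT FOR VALUES ('2023-01-01', '2024-01-01', '2025-01-01');

-- Map each partition to its own filegroup (boundary count + 1 filegroups required).
CREATE PARTITION SCHEME ps_OrderDate
    AS PARTITION pf_OrderDate TO (FG_2022, FG_2023, FG_2024, FG_2025);

-- Place the table on the partition scheme; the partitioning column must be
-- part of the clustered index key.
CREATE TABLE dbo.Orders
(
    OrderID    bigint        NOT NULL,
    OrderDate  date          NOT NULL,
    CustomerID int           NOT NULL,
    Amount     decimal(19,4) NOT NULL,
    CONSTRAINT PK_Orders PRIMARY KEY CLUSTERED (OrderDate, OrderID)
) ON ps_OrderDate (OrderDate);

-- Apply PAGE compression to every partition (can also be done per partition).
ALTER TABLE dbo.Orders REBUILD PARTITION = ALL
    WITH (DATA_COMPRESSION = PAGE);

Splitting a large table by date across dedicated filegroups is one common way to make index maintenance, compression rollout, and piecemeal restores more tractable; the right partition column and granularity depend on your workload.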
Prerequisites
Participants should have:
Basic SQL Server administration experience, including familiarity with database files, indexes, and query execution.
Understanding of storage and infrastructure concepts, such as filegroups, SANs, and I/O performance.
Comfort with performance monitoring tools and interpreting metrics like latency, throughput, and IOPS.

Jeff Taylor
Principal Data Consultant
Jacksonville, Florida, United States