
Dustin Dorsey
Sr. Cloud Data Architect, Onix
Nashville, Tennessee, United States
Dustin is currently a Principal Data Architect at Onix, where he leads data and analytics initiatives for some of the most innovative companies in the world. He has over 16 years of experience working with data across various industries, serving in diverse roles spanning administration, development, analytics, and leadership. He is a community leader who loves learning from and teaching others, which has led him to organize tech events and meetups for the past decade. He is one of the founders of the Nashville Data Engineering group and DataTune Nashville. Dustin is also co-author of “Pro Database Migration to Azure” and "Unlocking dbt" and is a regular speaker at data events around the world.
Topics
Managing Your Data Services Spend with Azure Cost Management
Cloud technologies provide many benefits, getting us what we need very quickly with an extensive library of services, easy-to-deploy infrastructure, and the flexibility to scale. However, without proper control and monitoring, you can end up with some unpleasant surprises when your bill arrives. In this session, I will dive into some of the amazing features available in Azure Cost Management and show you how to use it to avoid those surprises and maximize your Azure data services spend. We will first look at the critical role that planning and proper governance play in managing your costs, and then transition into a demo-heavy look at Azure Cost Management in the portal. Here I will show you how to analyze spend, effectively use budgets, and create alerting, and I will provide several tips and remediation steps for things that can signal overspend. I will also show you how you can use this tool to showcase the immediate financial savings from activities such as performance tuning and optimization. As data professionals, it is easy for us to focus solely on the technology and cast aside items such as cost management, but this is an important area where your knowledge and expertise are essential to success.
Introduction to Data Vault
With the large volume of data and data sources that data professionals work with today, it can often make sense to incorporate a data modeling technique into the integration layer of your data warehouse. This is where Data Vault comes in to help. Data Vault is a data modeling technique that provides a flexible framework sitting between the staging layer and the data mart layer in the data warehouse. This technique may be right for you if you are looking to integrate data from many sources quickly, reduce complexity in the raw data layer, or need a flexible way to handle frequently changing relationships. In this session, we will introduce you to the core concepts of Data Vault and practical use cases for it.
Improving Financial Efficiencies in Your Azure Environment
Whether you are just getting started with Azure or have been working with it for years, there is a good chance you are spending more on your run rate than you should be. You may even be part of an organization that is considering staying on-premises, or transitioning back, because the business is just not seeing the savings it was expecting. As technical and data professionals, we play a significant role in locating, remediating, and communicating these issues by applying our technical expertise around the services. This is especially true for data processing and storage in the cloud, which are often our most expensive services and biggest offenders. Focusing on the data side, I will show you how to improve financial efficiency all the way from building accurate budgets to controlling costs once you are already in the cloud. We will look at some of the often-overlooked services, programs, and tools Microsoft offers and how to use them effectively to lower our costs. Become a hero in your organization by learning several ways you can lower your Azure costs.
How Failing to Plan is Planning for Consultants
In this session, we'll dive into the critical aspects of strategic data modeling in dbt and highlight the pitfalls of neglecting proper planning. Many dbt users jump straight into building data models without considering important factors such as data source dependencies, reusability, documentation, testing, scalability, and performance. This can result in technical debt and costly mistakes that often require consultants to clean up.
Through real-world examples, we'll explore the consequences of failing to plan ahead in dbt and share practical strategies for avoiding common pitfalls. Whether you're an experienced dbt user or just starting out, this session will provide you with actionable insights to strategically plan and architect your data models in dbt for success, preventing potential headaches and the need for external consultants. Join us to learn how proper planning in dbt can save you time, effort, and resources in the long run, and ensure a smooth and efficient data engineering process.
dbt for the SQL Developer and DBA
With the emergence of cloud technology and the boom of new data services hitting the market, many data professionals, including developers and database administrators, are finding that they need to expand their existing skills to fulfill business needs or compete for top jobs in the market. This doesn’t mean you need to uproot the skills you already have to find success; it just means you need to evolve them. This is where dbt comes in. dbt takes the SQL skills you already have and builds amazing functionality on top of them using Jinja, presenting one of the easiest learning curves for SQL professionals to advance their careers in analytics. In this session, I will take my years of experience as a dedicated DBA and SQL developer and help bridge the gap between what you know and love about SQL and dbt by speaking your language. I will show you how easy it is to get started and take a deep look at the advantages dbt presents over built-in programming features like stored procedures and functions that you may be accustomed to.
De-Risk Your Database Move to Azure: Overcoming Business and Technical Roadblocks
Migrating an existing on-premises application to the cloud can be a daunting task that consists of many complicated steps. In this full-day session, we will cover critical considerations around migrating your workloads and discuss both the technical and business roadblocks that commonly challenge this process and how you can overcome them. We will go from beginning to end, starting with the pre-planning that must occur and moving into the actual migration and post-migration elements of the project, looking at the challenges along the way. While we will review the technical aspects, this is more than just a step-by-step session. It's a practical look at the real-world issues companies are challenged with today and what you can do to resolve them. We will cover topics such as budgets and costing, architectural decision making, security, team success factors, performance considerations, and much more. Join us to gain the skills you need to de-risk migrating your data workloads to the cloud!
DataOps Meets FinOps: Driving Efficiency and Cost Optimization With dbt
As organizations strive to maximize the value of their data, managing costs (FinOps) is a critical aspect of data operations. In this session, we will explore how dbt can be leveraged to optimize cost efficiency in data operations, while delivering high-quality data outcomes.
We will delve into advanced techniques for using dbt to drive cost optimization in data modeling, transformation, and delivery workflows. Topics will include strategies for optimizing performance, materializations, caching, and monitoring in dbt to ensure cost effectiveness. Real-world examples and practical tips will be shared to help attendees align their DataOps workflows with FinOps principles.
Join us to learn how dbt can be utilized as a powerful tool for driving cost efficiency in data operations, and gain insights to optimize your organization's data processes from a FinOps perspective. Whether you are a data engineer, analyst, or data leader, this session will provide valuable strategies to maximize cost optimization in your dbt workflows, ultimately leading to better data outcomes and improved bottom-line results.
Considerations for Building your First Cloud Data Warehouse
Whether you are looking at migrating from an existing data warehouse or your company is working down the path of building its first, a cloud solution should be a top consideration. The ever-evolving cloud puts limitless analytic capabilities within just a few clicks so that we can focus on building the solutions needed to answer critical business questions. However, it can get really overwhelming with the number of solutions out there, so where do you start and how do you figure out what is best for you? In this session, we will look at the top considerations needed to answer that question. We will look at all the major cloud providers, the main solutions they offer, common architectural approaches, and the factors that may influence your decision on each of these. This session will not be a technical deep dive, but a guide to help point you in the right direction for your next data warehouse.
Building Data Transformations with dbt
dbt is a free and easy-to-use data transformation tool that is exploding in popularity, transforming the landscape of data and analytics platforms across the world. The transformation (T) step in your extract-load-transform (ELT) process can often be seen as a bottleneck, requiring specialized skills, rigorous testing, and an extensive deployment process to handle even minor changes. dbt focuses on the transformation step, making it easy to transform data already in your database or data warehouse. No specialized skills are required; all you need with dbt is to write a SELECT statement. In other words, dbt takes the SQL skills that data analysts and engineers are already comfortable with and enables them to build their own analytics engineering workflows. This session is intended for anyone new to dbt who wants to know if it is the right fit for their organization. We will cover the dbt viewpoint, the fundamental uses of the tool, and demo several features so that you can get started right away creating automated analytics engineering workflows.
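As a quick illustration of the "just write a SELECT statement" idea above, a minimal dbt model might look like the following sketch (the source, table, and column names here are hypothetical):

```sql
-- models/staging/stg_orders.sql
-- A dbt model is just a SELECT statement; dbt wraps it in the DDL
-- needed to materialize it as a view or table in your warehouse.
-- The source ('shop', 'orders') and columns are illustrative only.

select
    order_id,
    customer_id,
    order_date,
    amount
from {{ source('shop', 'orders') }}
where order_date is not null
```

Running `dbt run --select stg_orders` would compile the Jinja, generate the appropriate CREATE statement for your warehouse, and build the object, with no hand-written deployment scripts.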
An In-Depth Look at Synapse SQL Pool vs SQL Server
One of the primary components of Azure Synapse Analytics is the Synapse SQL Pool, the back-end SQL data store used to host your traditional data warehouse. When you first connect through SQL querying tools, it has the look and feel of a familiar SQL database. However, after further investigation you quickly begin to realize there are quite a few differences. For those experienced with SQL Server, this can take some time to adapt to, since many of the tools, techniques, and scripts you have grown accustomed to just do not work. In this session, I want to take what you already know and love about traditional SQL Server and compare it to what Synapse SQL Pool offers to assist in your transition. I will walk you through several of the differences and show you what I wish I had known when I started using SQL Pools. We will look at the unique way architecture and use cases play a role in defining some of these differences and then go into the details of how T-SQL, security, performance tuning, and monitoring vary from what you may be used to. By the end of this session, you will understand the key differences and be ready to get started using Synapse SQL Pools immediately.
Building a Dimensional Model Data Warehouse with dbt
For nearly 30 years, companies have been using the dimensional model (or star schema approach) to construct their data warehouses with great success. While technology and techniques have changed greatly during that time, the dimensional model design is still widely used and relevant today. In this session, I will share with you why and then dive into the technical details of how dbt can be used to support this. We will look at how to structure your project, redesign considerations to work within the dbt framework, and walk through live demos of building the most common types of fact and dimension tables. While we won’t be able to cover every aspect, you will have the blueprint needed to get started immediately.
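As a rough sketch of the kind of dimension table this approach produces in dbt (table, column, and key names here are hypothetical; `generate_surrogate_key` comes from the optional dbt_utils package):

```sql
-- models/marts/dim_customer.sql
-- A simple type-1 customer dimension built as a dbt model.
-- All names below are illustrative, not from a real project.

{{ config(materialized='table') }}

select
    -- hash-based surrogate key from the dbt_utils package
    {{ dbt_utils.generate_surrogate_key(['customer_id']) }} as customer_key,
    customer_id,
    customer_name,
    city,
    country
from {{ ref('stg_customers') }}
```

The `ref()` call lets dbt infer the dependency on the staging model and build the warehouse in the correct order.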
Optimizing Azure Databases for Cost Control
Cloud technologies provide many benefits, getting us what we need very quickly with an extensive library of services, easy-to-deploy infrastructure, and the flexibility to scale. However, without proper control and monitoring, you can end up with some unexpected surprises when your bill arrives. In this session, we want to share tips and tools that you can use to avoid that. We will first look at the critical role that planning and proper governance play in managing your costs, and then transition into a demo-heavy look at the Azure Cost Management section of the portal. Here I will show how to analyze spend, effectively use budgets, create alerts, and utilize Azure Advisor recommendations so you can start creating more visibility and accountability within your organization. Data professionals are often overlooked when management is making budgetary decisions, but by learning these tools and features now, you will be in a much better position to influence key decisions about your data platform strategy.
Knowing and Controlling Your Azure Spend
Cloud technologies provide many benefits, getting us what we need very quickly with an extensive library of services, easy-to-deploy infrastructure, and the flexibility to scale. However, without proper planning, control, and monitoring, you can end up with some unexpected surprises when your bill arrives that can jeopardize your project. As technical professionals, it’s easy to get focused on the technology and not consider items such as cost management, but you are essential to successfully creating a cost-efficient system. In this session, we will review the primary components of budgeting and cost management, review the tools and programs in place to assist you, and provide several tips based on years of experience to help get you started.
How to Budget for an Azure Migration
You have heard all about the awesome opportunities that cloud technology provides, and you are sold! Technically, everyone agrees it's the right answer and is ready to get started. However, there is one problem. Your organization needs to understand what it's going to cost before approving the project. Understanding the cost to move to Azure and its run rate are questions many companies struggle with, and they can present a major roadblock to moving forward. Oftentimes it's not because the cost is too high; it's simply because they do not know what to expect and budget for. In this session, we will walk through the process of building estimates and creating an Azure budget, and discuss several unique considerations you need to be mindful of, such as networking, storage, service dependencies, workload considerations, and more. As you explore opportunities to migrate to the cloud, don't let cost be a deterring factor.
Baselining Performance with DEA
The Database Experimentation Assistant (DEA) is an A/B testing tool from Microsoft that allows you to evaluate differences in performance between a source and a target SQL Server. It does this by allowing you to capture a trace on an existing server and replay that trace against a new server to see how it impacts query and resource performance. This is incredibly valuable when you are planning SQL Server upgrades and want to see how new features and changes to the optimizer affect your queries, but it is also valuable when you are comparing hardware or planning a move to Azure services.
In this session, I will introduce you to this amazing and underutilized tool and show you how and when you should use it. We will also look at the built-in analysis and comparison reports it provides and how to apply them. And finally, we will talk through several considerations you need to be mindful of when using it.
Determining the Cost to Move to Azure
You have read all about moving to the cloud, spoken to colleagues who are on it, and even attended events where they told you how great it is, and YOU ARE SOLD! You recognize the benefits of running your workloads in the cloud and are ready to make things happen. Everyone agrees it's the right answer; however, there is one problem. Your organization needs to understand the expected costs before moving forward. Understanding the cost to move to Azure and its run rate are questions many companies struggle with, and they have led several to choose to stay in their current state. It is not because the cost is too high, as some may expect, but oftentimes simply because no one has any idea what to expect. In this session, I will walk you through steps that show you how to estimate the cost of your move to Azure and provide several unique considerations along the way that can have an impact on this. Next time someone asks how much, you will be ready!
Costly Mistakes You Are Making With Your SQL Licenses
If you are running SQL Server in a production environment, then you know that licensing is not cheap. SQL licensing can be confusing, seems to change often, and can be hard to navigate even for the most seasoned professionals. This session will dive into the top reasons your company may be overpaying (or underpaying) on its licenses and provide you with practical steps you can take to discover and fix those issues. This session is intended for anyone responsible for managing, allocating, or advising on SQL licensing.
It’s not my code running slow, it’s your server!
Having worked on development teams for years, this is something that I have heard more times than I can count. A developer believes bad performance is caused by the infrastructure, and an admin believes that bad performance is caused by the code, but who's right? The answer is that it could be either one or a combination of both, but you have to know where to look.
In this session, I will show you how you can use server wait stats in SQL Server to gain insight into why your servers may be running slow. We will start by looking at the role server resources (CPU, memory, network, and disks) play in the big picture of query processing and show you how you can use server wait stats to identify bottlenecks with each of these. We will also review common server waits that can occur at each crossroads of a query, reasons you may be experiencing them, and what you can do to solve the problem. Being able to look at the wait stats and interpret the results should help you answer the question: is it the server, the code, or both?
The goal for this session is that you walk away with a better understanding of what wait stats are and how you can put them to use immediately to figure out your performance issues.
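As a starting point for the approach described above, SQL Server exposes cumulative wait statistics through the `sys.dm_os_wait_stats` DMV; a common first query looks something like this (the list of benign wait types excluded here is deliberately abbreviated, and production scripts typically filter out many more):

```sql
-- Top waits accumulated since the last SQL Server restart
-- (or since the wait stats were last cleared).
SELECT TOP (10)
    wait_type,
    waiting_tasks_count,
    wait_time_ms,
    signal_wait_time_ms,
    -- time spent waiting on the resource itself, excluding
    -- time spent waiting for a CPU scheduler
    wait_time_ms - signal_wait_time_ms AS resource_wait_ms
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN (N'SLEEP_TASK', N'LAZYWRITER_SLEEP',
                        N'CHECKPOINT_QUEUE', N'BROKER_TASK_STOP')
ORDER BY wait_time_ms DESC;
```

High `signal_wait_time_ms` relative to total wait time tends to point at CPU pressure, while large resource waits point toward I/O, memory, locking, or network, which is exactly the server-versus-code triage the session walks through.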
Exploring the latest T-SQL enhancements
With the most recent releases of SQL Server, several enhancements were introduced to the T-SQL language. These include new string functions, bulk access options, and approximate query processing, among others, that address challenges SQL developers regularly confront. In this demo-heavy session, we will go over several of the new features, demonstrate how you can use them, and share any caveats you should be aware of before doing so. This session is intended for anyone interested in what’s new in the T-SQL language and will focus primarily on enhancements to traditional T-SQL for the database engine.
Azure Cost Management
One of the benefits of the cloud is the ability to create resources quickly and on demand as you need them. However, without proper control and monitoring, this can quickly generate some not-so-pleasant surprises when your bill arrives. Cost management used to be something that data professionals rarely had to think about, but the cloud is changing that. Unlike in on-premises environments, every decision we make in the cloud to deploy or scale comes at a cost that has to be considered.
In this session, we want to show you how you can create governance around your Azure environment to avoid these surprises and start getting the most from your spend, while still maintaining the flexibility that makes the cloud so valuable. We will discuss how you can allocate costs to specific areas and create more accountability in your organization using features such as tagging. We will also dive heavily into the Azure Cost Management portion of the portal and talk about analyzing spend, creating more visibility for the organization, creating budgets and what you can do with them, alerting around billing and spend, and how to review and interpret the cost-saving recommendations that Azure provides. We will also discuss how you can use Azure Consumption Insights to start building your own reports on this data in Power BI.
SQLSaturday Memphis 2022
Keynote - The Evolution of the Data Professional
Data Platform Summit
Modernizing Data Transformations with dbt
PASS Data Community Summit 2021 Sessionize Event
Data Platform Summit
Knowing and Controlling Your Azure Spend
New Stars of Data 2021 Sessionize Event
New Stars of Data (mentor)
Mentor for new speaker
SentryOne Accelerate 2020
Knowing and Controlling Your Cloud Spend (with Mike Wood)
Cloud Migration Panel (with Kevin Kline, John Sterrett, Erin Stellato, and Patrick Kelley)
SQLSaturday Memphis 2020
How to Budget for an Azure Migration (general session)
SQLBits 2020
Azure Cost Management (General Session)
Baselining Performance with DEA (General Session)
SentryOne Webinar
Optimizing Azure Databases for Cost Control
Code PaLOUsa 2020 Sessionize Event
SentryOne Webinar
Managing Cloud Costs: Budgeting for your migration
Redgate Community Circle
Why it's vitally important to use a Platform as a Service as a part of your cloud solution for data management
SQLSaturday Tampa 2020
Azure Cost Management (Extended Session)
Nashville SQL User Group
Determining the Cost to Move to Azure
SentryOne Webinar
The Top 4 Data Technology Trends in 2020 Every Data Professional Should Know
SQLSaturday Nashville 2020
Determining the Cost to Move to Azure (General Session)
SQLSaturday Nashville 2020
SQL Server Fundamentals: The Absolute Beginner's Guide to Querying Data (Full-day Pre-con)
SQLSaturday Memphis
Exploring the Latest T-SQL Enhancements
Nashville SQL User Group
Using Wait Stats to Determine Why My Server is Slow
Music City Tech 2019 Sessionize Event
SQLSaturday Indianapolis 2019
Managing the Mystery Database
SentryOne Webinar
Consolidating Your SQL Server Environment to the Cloud
SQLSaturday Louisville 2019
10 reasons you are paying too much (or too little) on your SQL licenses
SQLSaturday Birmingham 2019
Exploring the latest T-SQL enhancements
SQLSaturday Austin 2019
Managing the Mystery Database
SQLSaturday Pensacola 2019
Exploring the latest T-SQL enhancements
SQLSaturday Chattanooga 2019
Using wait stats to determine why my server is slow
SQLSaturday Cincinnati 2019
Understanding and making good decisions on SQL licensing
SQLSaturday Nashville 2019
Understanding and making good decisions on SQL licensing