Speaker

Rob Sewell

Be-Whiskered Automator of Things

Exeter, United Kingdom

Rob was a SQL Server DBA with a passion for PowerShell, Azure, Automation, and SQL (PaaS, geddit?). Now he just helps people make things. He is a Cloud and Data Center MVP and a Data Platform MVP, a SQLBits committee member, a PSConf EU organiser, and co-leader of the Data South West UK User Group. He has spoken at and volunteered at many Data and PowerShell events all over the world. He is a proud supporter of the Data and PowerShell communities.

He relishes sharing and learning and can be found doing both via Twitter and his blog. He spends most of his time looking at a screen and loves to solve problems. He knows that looking at a screen so much is bad for him because his wife tells him so. Thus, he is often out riding his bike as his eyes are not good enough to play cricket any more.

He has a fabulous beard.

Area of Expertise

  • Information & Communications Technology

Topics

  • PowerShell
  • Azure DevOps
  • Azure Arc Data Services
  • Azure Bicep
  • Infrastructure as Code

Why every project needs a dev container - an introduction

Dev containers are a VS Code feature that allows us to define the perfect development environment for our project.

This means a working development environment for your whole team, or your open-source contributors, can be quickly and easily provisioned and updated. It will even work in a browser! No more worrying about the right hardware, software installations, dependencies, or updates. You can control it all with dev containers and make it simple and easy.

In this session we'll cover why dev containers can be so useful, show off an example we built to make contributing to dbachecks (an open-source module combining Pester and dbatools to test your infrastructure) easy, and show how you can easily add a dev container to your own projects.
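As an illustrative sketch only (the image, extension, and command below are assumptions, not the actual dbachecks configuration), a minimal .devcontainer/devcontainer.json might look like:

```json
{
  "name": "my-powershell-project",
  "image": "mcr.microsoft.com/devcontainers/base:ubuntu",
  "customizations": {
    "vscode": {
      "extensions": ["ms-vscode.powershell"]
    }
  },
  "postCreateCommand": "pwsh -Command 'Install-Module dbatools -Force'"
}
```

VS Code (or a browser-based editor such as GitHub Codespaces) reads this file, builds the container, installs the listed extensions, and runs the post-create command, so every contributor gets the same environment.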

A SQL environment for test or how we provided a dbatools training day at SQLBits using devcontainers

In March Jess Pomfret and I gave a Training Day for 5 million* attendees about dbatools.

For those that don't know, dbatools is the most popular SQL Server open-source PowerShell module and has used blog posts from many of the SQL community as inspiration for over 900 million* commands written by over 4tn* contributors.

As we did not know, right up until the day, whether we would be delivering together or separately, in person or remotely, we decided to put our demos into a repository and deliver them from a controlled environment (two containers) using GitHub Actions, Docker Hub, and VS Code.

We learned some things along the way, had some fun too, and we shall share it all with you.

Whilst this session is about dbatools, PowerShell, containers, Pester, and some fun, it will also show you how to create a SQL Server environment that enables you to repeatedly run demos and allows others to run the same code as you.

* It is highly likely that some numbers may have been exaggerated

Automate your DBA tasks with dbatools and dbachecks

Let us show you how easy it is to use PowerShell to accomplish many everyday tasks straight from the command line with the community developed modules dbatools and dbachecks.

Join Jess Pomfret and Rob Sewell, two of the dbatools in a Month of Lunches authors, and take your dbatools and dbachecks skills to the next level, or be introduced to dbatools and learn how effective it can make you. We will show you the skills, scripts, and tricks that we have learned with dbatools and dbachecks and tell you how they have rocketed our careers.

There will be many practical examples of how dbatools and dbachecks can help to save time in DBA administration duties:

- Creating Availability Groups
- Syncing Agent Jobs and Logins between replicas
- Finding SQL Instances on your estate
- Working with Central Management Server
- Disaster Recovery
- Backup Testing
- Simple instance migrations
- Complex instance migrations
- Tracking Activity
- Simplify application upgrades with database snapshots
- Working with other community tools with dbatools
- Encompassing dbatools in DevOps
- Validating your estate
- Using dbatools in the cloud (i.e. working with Azure SQL Database)

And much, much more.
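Several of the tasks above map directly onto dbatools commands. As a hedged sketch (instance names and the shared path are placeholders, and dbatools must be installed):

```powershell
# Find SQL Server instances on your estate
Find-DbaInstance -ComputerName server01, server02

# Test your backups by restoring them and running DBCC CHECKDB on a test instance
Test-DbaLastBackup -SqlInstance sql01 -Destination sqltest01

# A simple instance migration: logins, jobs, databases, and more
Start-DbaMigration -Source sql01 -Destination sql02 -BackupRestore -SharedPath \\fileshare\migration
```

Each of these replaces what would otherwise be a long manual checklist with a single, repeatable line.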

Come and join us for a great day of PowerShell and SQL full of demos and useful solutions that you will be able to take back to your workplace. You will improve your knowledge and skills, see how the modern DBA will work and have fun at the same time.

Who Are You?

You are a DBA, senior or junior, looking after 1 or 100,000 instances; a developer who interacts with SQL Server; an accidental DBA; or you have DevOps in your job title.
You are interested in improving your PowerShell skills for SQL Server or reducing the time you and your team spend administering SQL Server.
You have spent 1 day or 20 years working with SQL Server.
You know that you work in a field where automation is king and want to understand more of what is available.
You are interested in learning and improving your automation skillset.

What do you need?

All you need is yourself! As a minimum, we would recommend that you bring something to take notes with.

All scripts and slides will be provided to you. If you wish to follow along with most of the demos, you should have access to a machine with PowerShell v5 or above and a SQL Server 2012 or above instance.

20 Productivity tools we use to make life easier

You are busy, we are busy, everyone is busy. These are the tools that two busy consultants use to improve their productivity.

We want to tell you about the tools that we use: the ones that maximise our time, simplify our lives, reduce our keystrokes, and generally make us appear more amazing, so that you can use them too.

Control your Cloud Data Deployments - Deploy your Azure Data Solutions with Bicep and Azure DevOps

You can go to the portal, click through the blades, and manually create all of the resources that you need, such as Azure SQL Databases, Azure SQL VMs, Azure Arc-enabled Managed Instances, Azure Databricks, or Azure Data Factory. You will want to ensure that they are all logging to Log Analytics and have monitoring enabled, and maybe you need all of them to have Private Endpoints set up.

But you will forget something, you will mistype something, and you will need to create multiple environments that all need to be the same. Manually creating things is time-consuming and mistake-prone.

Humans make mistakes.

Come and join me for the day and I will show you how to deploy your data infrastructure into Azure using Bicep and Azure DevOps, and how to ensure that when you deploy your resources, you automatically include Log Analytics, monitoring, or Private Endpoints. I will share the lessons that I have learnt along the way doing this for my clients.

You will learn
• What Bicep is
• How to compose Bicep with VS Code
• How to create a pipeline in Azure DevOps to deploy the code
• How to add Logging and Monitoring to your Azure Data Platform
• How to create modules so that you can ensure that you deploy all of the resources that you require quickly, easily and repeatedly
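To give a flavour of what composing Bicep looks like (the resource names, API version, and SKU here are illustrative assumptions, not a recommended configuration):

```bicep
// Deploy an Azure SQL Database onto an existing logical server
param location string = resourceGroup().location
param serverName string
param databaseName string

// Reference a server that already exists in this resource group
resource sqlServer 'Microsoft.Sql/servers@2022-05-01-preview' existing = {
  name: serverName
}

// Declare the database as a child of that server
resource sqlDatabase 'Microsoft.Sql/servers/databases@2022-05-01-preview' = {
  parent: sqlServer
  name: databaseName
  location: location
  sku: {
    name: 'S0'
    tier: 'Standard'
  }
}
```

Wrap files like this into modules and every environment you deploy from the pipeline gets exactly the same configuration.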

You are
• interested in Infrastructure as Code for cloud data resources
• interested in learning new things
• keen to deploy consistently to Azure
• perhaps someone who has never heard of Bicep or pipelines

Control your cloud data deployments - Deploy your Azure Data Solutions with Bicep and Azure DevOps

Let's migrate to the cloud.

You can go to the portal, click through the blades, and manually create all of the resources that you need, such as Azure SQL Databases, Azure SQL VMs, Azure Arc-enabled Managed Instances, Azure Databricks, or Azure Data Factory. You will want to ensure that they are all logging to Log Analytics and have monitoring enabled, and maybe you need all of them to have Private Endpoints set up.

But you will forget something, you will mistype something, and you will need to create multiple environments that all need to be the same. Manually creating things is time-consuming and mistake-prone.

Humans make mistakes.

Come and join me and I will show you how to deploy your data infrastructure into Azure using Bicep and Azure DevOps, and how to ensure that when you deploy your resources, you automatically include Log Analytics, monitoring, or Private Endpoints.

You will learn
• What Bicep is
• How to compose Bicep with VS Code
• How to create a pipeline in Azure DevOps to deploy the code
• How to add Logging and Monitoring to your Azure Data Platform
• How to create modules so that you can ensure that you deploy all of the resources that you require quickly, easily and repeatedly

You are
• interested in Infrastructure as Code for cloud data resources
• interested in learning new things
• keen to deploy consistently to Azure
• perhaps someone who has never heard of Bicep or pipelines

Think PowerShell and SQL is meh? Meet dbatools!

Come and join one of the creators of the most popular SQL Server open-source PowerShell module and learn how we have used blog posts from many of the SQL community as inspiration for over 900 million* commands written by over 4tn* contributors.

Let me show you how easy it is to use PowerShell to accomplish many everyday tasks straight from the command line with the community developed module dbatools.

I will even demonstrate in containers using Jupyter Notebooks that you can try out for yourself on your own machine.

Afraid of PowerShell?
Worried it's too complicated?
Don't want to go through the learning curve?

Let me show you, with plenty of demos, how easy and straightforward it can be, and I will save you time starting Monday.

Want to restore an entire server's databases to the latest available point in time of the backups in just one line?
Would it be useful to know the last DBCC check for your entire estate in only one line of code?
Need to test your restores but think it's complicated?
And many, many more.
All this in a fast-paced, fun session.
* Some numbers may have been exaggerated
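The one-liners teased above could, as a sketch, look like this (the instance names and backup path are placeholders; dbatools must be installed):

```powershell
# Restore an entire server's databases to the latest available point in time of the backups
Restore-DbaDatabase -SqlInstance sql01 -Path \\backups\sql01

# Last known good DBCC CHECKDB across the estate, in one line
Get-DbaLastGoodCheckDb -SqlInstance sql01, sql02, sql03
```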

Using PowerShell with Excel

You are often required to put data into Excel for sharing with business users. This is frequently a manual process involving copying and pasting and manipulating in Excel.

Come and lose the Ctrl+C, Ctrl+V by learning how the ImportExcel module can automate this for you, including creating charts and conditionally formatting cells.
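As a hedged sketch of what this looks like with the ImportExcel module (the data source, file path, and highlighted text are illustrative assumptions):

```powershell
# Export data straight to a formatted Excel workbook - no copy and paste required
Get-Process |
    Select-Object -Property Name, CPU, WorkingSet |
    Export-Excel -Path .\processes.xlsx -AutoSize -TableName Processes `
        -ConditionalText (New-ConditionalText -Text 'pwsh' -BackgroundColor Yellow)
```

One pipeline replaces the whole copy, paste, and format-by-hand routine, and it runs the same way every time.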

I ain't a Data Scientist - Why do I need Jupyter Notebooks?

Jupyter Notebooks were once the realm of Data Scientists. New releases of Azure Data Studio, Visual Studio Code, and .NET interactive tooling have brought this tooling into the Operational team's area. The biggest benefit of using Jupyter Notebooks is that you have your documentation, your code __and__ your results in the same source controllable document.

I will share with you all of the knowledge I have gained over the past 12 months implementing Jupyter Notebooks for Data Operation teams.

You will leave
- with a good understanding of possible use cases for Jupyter Notebooks
- being comfortable in using the different tooling
- with many examples that you can take back to work and start being effective immediately

Enabling collaboration with your team, simplifying common tasks, improving Incident Wash-Up meetings, creating run-books, easily creating code for others to use are some of the benefits that you will take away.
We will have fun as well.
Prerequisites:
- Working in an operational team supporting Data Platform systems
- Lack of pogonophobia!
- Willingness to learn

We will use Azure Data Studio.

Notebooks, PowerShell and Excel Automation

Everyone loves Excel. A common request that I hear is, "Can you put that into Excel for me, please?" I love automation, and so I have combined two of my favourite tools: PowerShell and Notebooks. Using Jupyter Notebooks to accomplish regular tasks and run-books has become common over the past year.

A particular use-case that my clients have found very useful is to quickly create Excel sheets of information using Notebooks, PowerShell, and the ImportExcel module.

Let me share my experience with you and show you how to create your own Notebooks for your users and the tactics I have learned to enable users to self-service and create their own Excel Workbooks.

Azure Arc-Enabled Data Services Introduction

Azure Arc-enabled Data Services allow you to provide the capabilities, functionality, and ease of management of Azure Data Services, but hosted on your own infrastructure or in other clouds. This lets you get all the latest Azure data innovations, elastic scaling, and a single pane of glass for all your data workloads, with or without a direct connection to the cloud.

Being able to host your own PaaS services, no longer worrying about patches and upgrades, and utilising system-managed backups, while still being able to manage those services from the Azure Portal and utilise Azure Monitor and security systems, will be appealing to many organisations who cannot fully migrate to the cloud or who want the ability to move to a different cloud provider in the future.

Come and learn about this offering and how it can help you to manage your Data Services no matter which cloud you are hosting them on.

A Managed Instance from nowt to query in 10 minutes with Azure Arc Enabled Data Services

Managed Instances in Azure take a considerable time to provision.
In this 10-minute session we will start with nowt and create a Managed Instance, restore a database to it, and query it, all inside 10 minutes, using Azure Arc-enabled Data Services running on Kubernetes.

* nowt - Northern UK word meaning nothing

A deeper dive into dev containers for your project

Dev containers are a VS Code feature that allows us to define the perfect development environment for our project.

This means a working development environment for your whole team, or your open-source contributors, can be quickly and easily provisioned and updated. It will even work in a browser! No more worrying about the right hardware, software installations, dependencies, or updates. You can control it all with dev containers and make it simple and easy.

In this session we will dive deeper into dev containers and take a look at how we can use multiple containers for our development, multiple configurations for our dev containers, how to customise the entire experience and how to connect to your containers externally.

This session is a natural follow-on from our Why every project needs a dev container session, but it can also live on its lonesome.

South Coast Summit 2022

October 2022 Southampton, United Kingdom

DATA:Scotland 2022

September 2022 Glasgow, United Kingdom

SQLBits 2022

March 2022 London, United Kingdom

SQL Ireland Meetup - 2022

January 2022 Dublin, Ireland

Data Saturday Portugal 2021

December 2021

New Stars of Data #3

October 2021

Azure Bootcamp South Africa 2021

September 2021

Data ANZ

June 2021

DataMinutes #1

June 2021

Power Saturday 2021

June 2021

Data Saturday Southwest US

May 2021

Global Azure Lüdinghausen 2021

April 2021

Data Ceili Dublin 2020

July 2020 Dublin, Ireland

Power Saturday 2020

June 2020 Paris, France

psconf.eu 2020

June 2020 Hannover, Germany

dataMinds Connect 2019

October 2019 Mechelen, Belgium

Data Saturday Holland

October 2019 Utrecht, Netherlands

DATA:Scotland 2019

September 2019 Glasgow, United Kingdom

DataGrillen 2019

June 2019 Lingen, Germany

Techorama Belgium 2019

May 2019 Antwerpen, Belgium

Data in Devon 2019

April 2019 Exeter, United Kingdom

SQLGLA 2018

September 2018 Glasgow, United Kingdom

SQLGrillen 2018

June 2018 Lingen, Germany
