Speaker

Nick Dickinson-Wilde

Backend Drupal Manager - Taoti Creative

As Backend Drupal Manager with Taoti Creative, I am responsible for a team working on a variety of concurrent projects - rarely fewer than 15 in any week. When not supporting my team, I am the primary maintainer of our company dev-ops systems, and somehow fit in heads-down coding time as well. On my own builds I primarily do backend code and site building, but I also participate in both estimating and planning projects and features. I am unofficially our evangelist for Drupal contributions.

Standardize sites with Drupal Scaffolding

For agencies and other organizations that manage multiple or many sites, one challenge is keeping shared code - in particular codebase infrastructure - up to date. Drupal's Scaffolding composer plugin is one particularly effective solution, used by hosts such as Pantheon and Amazee as well as a growing number of agencies.
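As a minimal sketch of how the plugin is wired up, a project's composer.json declares the scaffold locations and any file overrides; the paths and the robots.txt override here are illustrative, not a recommended setup:

```json
{
    "require": {
        "drupal/core-composer-scaffold": "^10"
    },
    "extra": {
        "drupal-scaffold": {
            "locations": {
                "web-root": "web/"
            },
            "file-mapping": {
                "[web-root]/robots.txt": "assets/robots.txt"
            }
        }
    }
}
```

On every composer install or update, the plugin copies the scaffold files into the web root, so a shared upstream package can keep files like index.php and .htaccess consistent across all of your sites.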

In this session we'll learn what the capabilities of the Scaffolding plugin are, then move on to how I use it - both at work and personally, with the differences in requirements between the two. We'll then move on to common difficulties and pitfalls in standardizing your site setup - from new sites to those old sites that kinda make you grimace - whether inherited or your own code.

Handling Low Quality Data in Drupal Migrations

So, you are migrating a site to Drupal – or just importing some data, whether as a one-off or a regularly repeating process – ideally, you have perfectly regular data. Unfortunately, the world is rarely ideal, so what can you do when your data is very much not nice? When importing through the Migrate API, there is a lot you can do.

Drupal 7 to Drupal 8/9 migrations often have nice data – predictable structure, with most items of any one content type mostly the same. On older Drupal sites, though, data is often less consistent between eras of the site. Spreadsheets of data from external sources outside your control are often distinctly unreliable – as are exports from many other external systems such as DotNetNuke – and can be very challenging.

We'll discuss various methods to resolve this, ranging from simple migration plugin configuration adjustments to full-fledged custom migration plugins. We'll initially focus on YAML migration definitions before moving on to using both source and process plugins to solve some real-world examples of messy data.
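As a taste of the configuration-only end of that range, here is a sketch of a YAML migration that cleans messy spreadsheet rows with core process plugins (the file path, field names, and date format are hypothetical, and the csv source plugin comes from the contributed migrate_source_csv module):

```yaml
id: legacy_articles
label: 'Legacy articles from a spreadsheet export'
source:
  plugin: csv
  path: /path/to/articles.csv
  ids: [id]
process:
  title:
    # Strip stray whitespace from the source value.
    - plugin: callback
      callable: trim
      source: title
    # Drop rows that have no usable title at all.
    - plugin: skip_on_empty
      method: row
      message: 'Skipped row: empty title'
  field_date:
    # Normalize inconsistent date strings into a storable format.
    - plugin: format_date
      source: published
      from_format: 'm/d/Y'
      to_format: 'Y-m-d'
destination:
  plugin: entity:node
  default_bundle: article
```

Chaining process plugins like this handles a surprising amount of dirty data before you ever need a custom plugin; when the rules get more complex than a chain can express, that is the signal to drop down to a custom source or process plugin in PHP.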

Debugging Drupal with Xdebug

Xdebug is a PHP extension that can massively improve your debugging experience and speed – for both backend and frontend developers. Conversely, it can also drag your PHP performance down to a crawl, so you want to configure it as a simple toggle – and only enable it when you are using it. Xdebug works with most IDEs, but I will be specifically showcasing usage with PHPStorm & VSCode. For many things, debugging with Xdebug is neither better nor worse than any other method such as var_dump(), \Drupal::logger() statements, etc.
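One common way to get that toggle, as a sketch using Xdebug 3 settings (host and port here are typical local-dev values, not requirements):

```ini
; conf.d/99-xdebug.ini
; Step debugging is compiled in but dormant by default.
xdebug.mode = debug
; Only start a debug session when explicitly triggered
; (e.g. the XDEBUG_TRIGGER cookie, GET/POST parameter, or env variable).
xdebug.start_with_request = trigger
xdebug.client_host = localhost
xdebug.client_port = 9003
```

With start_with_request set to trigger, ordinary requests run at near-normal speed, and a browser extension or an environment variable flips debugging on only for the requests you actually want to step through.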

Many developers find that Xdebug is faster, but in reality breakpoints are just another tool. Xdebug usage is unrelated to seniority or development expertise. There are cases where Xdebug really excels compared to other options – such as building migrations, or other processes that run through the same code path many times with slightly different variables.

Additionally, Xdebug can debug Twig and let you explore available variables without worrying about memory. This use case is also often a significant speed improvement over kint() or other variable-dumping methods that may show incomplete data to get around memory limits.
Xdebug has lots of functionality beyond breakpoints and variable exploration, but one feature I particularly like is using it like `drush php` but in the middle of a request, with the current variables available.

After basic installation/configuration, we will debug a spreadsheet to Drupal migration for a bit of real world usage.
