Manchester, England, United Kingdom
My name’s Joe Griffin, and I’m Director & Principal Consultant at SOLO Cloud Solutions. I hold various Microsoft certifications in Dynamics CRM/365 Customer Engagement, Azure, Power BI and SQL Server, and am also a PRINCE2 Project Manager and Agile Practitioner. I have broad experience working with, and managing projects involving, Dynamics 365 Customer Engagement, Azure, Power BI, .NET and other technologies within the Microsoft cloud stack. Along the way, I’ve developed a keen interest in coding, and love working with Transact-SQL, JScript, PowerShell, DAX & C#. My previous roles have given me practical experience within the energy efficiency, managed services, retail, print & direct mail and utility sectors. I love getting involved in the Business Applications technical community and was very fortunate to receive the Most Valuable Professional (MVP) award from Microsoft in 2020, in recognition of my contributions in this area. When I’m not blogging or tinkering, I enjoy reading, catching up on the latest films and TV shows, and learning about history.
In today’s fast-moving cloud world, it can sometimes be difficult to evaluate new solutions in the marketplace that satisfy Extract, Transform and Load (ETL) scenarios or data integration needs involving the Common Data Service (CDS). Typically, we may find ourselves returning to “safe” options such as SQL Server Integration Services (SSIS), or looking at investing in potentially expensive third-party solutions, such as Scribe Online or KingswaySoft. With version 2 of Azure Data Factory (ADF), developers now have a product available with sufficient feature parity to SSIS. In addition, ADF offers a range of additional options, such as the ability to manage your solution within Azure DevOps or seamlessly integrate with Azure Functions to execute custom code. All of this, and more, can be implemented for little or no cost.
In this deep-dive session, we’ll look at how Azure Data Factory can be used to successfully migrate an on-premises instance of Dynamics 365 Customer Engagement to a cloud CDS instance. As part of this, attendees will become familiar with the process involved in deploying their first Azure Data Factory pipeline, from start to finish. No previous experience with Azure Data Factory is required, although attendees should have some familiarity with CDS entities and how to deploy resources into Azure.
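By way of a flavour of what’s involved once a pipeline exists, here’s a minimal, hypothetical C# sketch of triggering a pipeline run via the Azure management REST API. The subscription, resource group, factory and pipeline names are all placeholders, and acquiring the Azure AD access token is assumed to happen elsewhere.

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

// Triggers a run of an existing Azure Data Factory pipeline via the
// management REST API. All resource names below are placeholders.
class AdfPipelineTrigger
{
    static async Task Main()
    {
        // Assumed to have been acquired beforehand, e.g. via an Azure AD
        // client credentials flow; token acquisition is omitted for brevity.
        string accessToken = Environment.GetEnvironmentVariable("AZURE_ACCESS_TOKEN");

        string url = "https://management.azure.com/subscriptions/<subscription-id>" +
                     "/resourceGroups/<resource-group>" +
                     "/providers/Microsoft.DataFactory/factories/<factory-name>" +
                     "/pipelines/<pipeline-name>/createRun?api-version=2018-06-01";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);

        // An empty JSON body starts the run with the pipeline's default parameters.
        var response = await client.PostAsync(
            url, new StringContent("{}", Encoding.UTF8, "application/json"));

        // On success, the response body contains the runId of the new pipeline run.
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}
```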
With the introduction of the Power Platform, an awareness of Power BI moves from being desirable to absolutely essential when building a truly unified and engaging business application solution. But with a plethora of existing reporting options available within D365 Customer Engagement (D365CE), attempting to identify the best usage scenario for Power BI, and the most appropriate technical architecture for it, can be tricky. Complications can also arise when traditional SQL/BI developers have to familiarise themselves with the many oddities D365CE presents from a data access perspective. In this session, I hope to provide a “crash course” discussion on how and when to use Power BI with D365 Customer Engagement, before looking at the various options available to access D365CE data and streamline the adoption of your Power BI solution.
Did you know that Microsoft have a web application analytics tool that is as good as, if not better than, Google Analytics? It's called Application Insights and, when implemented, provides application developers and digital teams with a rich array of information that can help to identify website usage trends, bottlenecks within applications, which browsers are being used and a whole lot more. Even better, all of this information can be easily consumed within a Power BI dashboard or hived off to a separate reporting application for even more detailed analysis. The cherry on the cake, though? Application Insights is fully compatible with Dynamics 365 Customer Engagement and, with minimal setup involved, you can begin to generate an even greater level of insight into the health of your Customer Engagement deployment. In this session, attendees will see how easy it is to get started with Application Insights and how you can fully leverage the solution to generate as much business benefit as possible from the underlying data it exposes.
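To give a sense of how little setup is involved, here’s a minimal sketch of pushing custom telemetry into Application Insights from any .NET component using the SDK. The instrumentation key, event name and trace message are all placeholders.

```csharp
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.Extensibility;

// Minimal example of sending custom telemetry to Application Insights.
// The instrumentation key is a placeholder for the one exposed by your
// Application Insights resource in the Azure portal.
class TelemetrySample
{
    static void Main()
    {
        var config = TelemetryConfiguration.CreateDefault();
        config.InstrumentationKey = "<your-instrumentation-key>";

        var client = new TelemetryClient(config);

        // Track a hypothetical business event and a supporting trace message.
        client.TrackEvent("CaseEscalated");
        client.TrackTrace("Escalation processed by integration component.");

        // Telemetry is batched in memory; flush before the process exits.
        client.Flush();
    }
}
```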
Azure is big and scary. There is a lot to learn, and it's changing all the time. Even getting started with it, to begin leveraging some real benefits alongside your Dynamics 365 Customer Engagement deployment, can be a challenge. The aim of this session is to dispel any uncertainty and provide an open forum for Q&As and troubleshooting relating to Azure. During this two-part roundtable session, we will cover basic concepts, such as subscriptions, resource groups and creating resources, before jumping into an open forum for questions and live demonstrations to address anything raised by attendees. We will also look to provide answers to questions raised before the event, via engagement on social media.
With the introduction of the Power Platform, an awareness of Power BI moves from being desirable to absolutely essential when building a truly unified and engaging business application solution. But with a plethora of existing reporting options available within model-driven Power Apps and the Common Data Service (CDS), attempting to identify the best usage scenario for Power BI, and the most appropriate technical architecture, can be tricky. Complications can also arise when traditional SQL/BI developers have to familiarise themselves with the many oddities the CDS presents from a data access perspective. In this session, I hope to provide a “crash course” discussion on how and when to use Power BI with the CDS, before looking at the various options available to surface CDS data and streamline the adoption of your Power BI solution.
Perhaps one of the features that makes Dynamics 365 and the Power Platform so compelling is their much-touted capability to integrate with a wide variety of data sources, regardless of where these reside. This not only allows you to centralise all of these disparate sources into a single view but also enables you to leverage some of the exciting capabilities within Power BI and Power Automate with minimal effort. The purpose of this talk is to discuss, evaluate and showcase all options currently available for getting your on-premises data into the CDS or making it available for use within tools such as Power Automate. Each option has its distinct advantages and disadvantages, and the session will aim to present these and offer a suggested answer, based on the business requirement that needs to be met.
Often the most challenging thing when it comes to Project Service Automation (PSA) is getting started. There are a plethora of different configuration options, often with little or no documentation to help speed you along. This can make it a considerable challenge for any business to start using the module to significant effect. In this session, I will aim to guide experienced Dynamics 365 / Dynamics CRM professionals through all the steps involved in building out your project delivery practice within PSA. We'll take a hands-on approach throughout the session, focusing primarily on demonstrations, alongside some written support material. Attendees should have a good familiarity with how Dynamics 365 works, including entity customisations, the Common Data Service and working with the new unified interface.
Dynamics 365 and Power Platform developers should feel especially blessed at the moment, as we have seen the release of not one, but two, new developer-focused exams within the space of a year. However, as part of this, we must say goodbye to exam MB-400 at the end of 2020 and embrace the new PL-400 exam. For those who have sat MB-400 or are contemplating sitting PL-400 soon, this session is for you - we will do a deep-dive analysis of the skills measured within PL-400, providing useful information on the core topics that you will be expected to know to do well in the exam.
As your Common Data Service deployment matures, it may become necessary to start leveraging some of the capabilities on offer within Microsoft Azure. Whether you wish to copy your database out into an Azure SQL database or leverage tools such as Logic Apps, there are a multitude of ways in which administrators and developers can bring Azure into the equation, often with little effort involved. In this session, we'll take a look at how you can straightforwardly "bolt on" Azure capabilities alongside the Common Data Service, to help meet requirements such as disaster recovery, advanced reporting or adopting a true ALM approach to managing your environments. By the end of the session, attendees will have gained an understanding of some of the most common use cases for Azure in the context of the Common Data Service.
SQL Server is a fantastic solution - given this is a Data Platform conference, you would probably throw me out for not saying this! However, it is worth considering how we can use solutions such as the Common Data Service (CDS), Power Apps and Power Automate to help streamline the development of an internal application, or even have them "bolted on" alongside a bespoke or customer-facing application. This session will aim to introduce the fundamentals behind these technology areas, and the benefits they can deliver, framed by someone who finds ANY opportunity to use SQL Server whenever they can. Through a live demonstration, attendees will also see how easy it is to set up an entire CDS database, and all its associated components, in comparison to doing it the T-SQL way.
Azure API Management (APIM) is a great tool for bringing together your various API endpoints into a single, unified endpoint that can be secured via a variety of different options. However, attempting to work with this tool alongside Azure Resource Manager templates can leave you scratching your head, as you attempt to figure out the correct resources you need to build out your API from end to end.
If you've found yourself facing, and failing to overcome, this problem before, then this is the session for you - we'll talk through how to structure your APIM deployment in the most effective way and then demonstrate how it can be deployed out into Azure.
Attendees to this session should ideally have a good grasp of working with Azure and APIM, and a working knowledge of building out Azure Resource Manager templates.
When working with model-driven apps targeting Microsoft Dataverse, you have a lot of capability to automate your processes, enforce business logic and perform complex calculations without needing to resort to code. However, there will be specific situations where you'll need to look at cracking open Visual Studio and putting together some C# to perform the operations you need - knowing the when, where and how behind all of this can be challenging, though. In this session, we'll provide a beginner's overview of what plug-ins are, the scenarios where they are most appropriate for use and an end-to-end demonstration of how to build one from scratch. Attendees will gain a high-level understanding of how plug-ins operate at the platform level, the overall process of deploying them and some useful guidance on how to utilise them within current or future projects.
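For a sense of what this looks like in practice, here’s a minimal plug-in sketch. The table and column names are hypothetical, but the IPlugin pattern itself is what the platform expects when the assembly is registered.

```csharp
using System;
using Microsoft.Xrm.Sdk;

// A minimal Dataverse plug-in. When registered against, say, the Create
// message of a table, the platform instantiates this class and calls Execute.
public class SampleCreatePlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        // The context carries details of the operation that fired the plug-in.
        var context = (IPluginExecutionContext)
            serviceProvider.GetService(typeof(IPluginExecutionContext));

        var tracing = (ITracingService)
            serviceProvider.GetService(typeof(ITracingService));

        if (context.InputParameters.Contains("Target") &&
            context.InputParameters["Target"] is Entity target)
        {
            tracing.Trace("Plug-in fired for {0}", target.LogicalName);

            // Example business logic: default a hypothetical column
            // if the caller did not supply a value.
            if (!target.Attributes.Contains("new_status"))
            {
                target["new_status"] = "Open";
            }
        }
    }
}
```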
Power BI often gets overlooked by Business Application professionals, as it's more often geared towards those with a Data Platform background. Notwithstanding this, knowledge of Power BI is essential and, indeed, complementary for many things you may end up doing with the Power Platform, such as working with Dataflows or deploying a Power BI report alongside a Power App. With the DA-100 exam, Power Platform professionals now have an ideal route to test their skills and prove their mettle with Power BI, earning a shiny Associate badge in the process.
Join me in this session, where I will provide an overview of the DA-100 exam. We'll dive deep into the various skills measured across the exam, with a particular focus on contextualising each area for those who are already experienced with the Power Platform. Following on from the session, I hope to have equipped you with the insight and knowledge to tackle any gaps the exam brings up, allowing you to revise effectively and achieve success when you sit it.
Having a high-level, introductory overview of Microsoft Azure is essential for anyone working with Microsoft these days. In particular, Azure is one of the primary ways to extend the capabilities of the Power Platform, via pro-developer tools such as Functions, Data Factory or Service Bus. Setting yourself the target of sitting and passing the AZ-900 exam is something I would recommend to anyone working with the Power Platform today, but this can often be easier said than done.
To help you along and remove any excuse not to sit this exam in future, this workshop will aim to provide a condensed, highly focused review and demonstration of the core concepts you'll need to grasp to do well in the exam. Attendees should anticipate a "hands-on" session, with plenty of demonstrations, knowledge checks and follow-on tasks, designed to reinforce your learning. If this all sounds good, then this is the workshop for you!
Dataflows within Power Apps provide some intriguing new capabilities for one-off data imports or ongoing integrations targeting Microsoft Dataverse or many of the Dynamics 365 Customer Engagement apps. Better still, you don't necessarily have to learn it all from scratch; if you have experience working with Power BI or Power Query, it's surprisingly straightforward to build out your integration and have it run according to your individual needs.
The purpose of this session is to introduce Dataflows, aimed at those with previous experience working with the classic data import experience or tools such as Scribe Online or KingswaySoft. We'll discuss their key capabilities, highlight how they differ from Power BI dataflows, see how they work in practice and demonstrate how a dataflow solution can be moved out into Azure to unlock additional capabilities.
The great thing about working with Microsoft Dataverse is that we can write server-side plug-ins using C# to execute more complex business logic. But there will be occasions where, due to the complexity or the types of libraries we want to work with, this will be impractical. Enter, stage left, Azure Functions, which provide an alternative mechanism for building our plug-ins, and let us leverage additional features in the process. Join me for this session, where we will demonstrate how to build this out from scratch and touch upon some of the benefits, and things to watch out for, when going down this route.
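As a taste of what we'll cover, here's a minimal sketch of an HTTP-triggered function (using the in-process .NET model) that Dataverse could call, for example via a webhook registered with the Plug-in Registration Tool. The function name and payload handling are purely illustrative.

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

// An HTTP-triggered Azure Function that Dataverse could invoke as a
// webhook in place of a conventional plug-in.
public static class DataverseWebhookFunction
{
    [FunctionName("DataverseWebhook")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        ILogger log)
    {
        // Dataverse posts the remote execution context as JSON.
        string body = await new StreamReader(req.Body).ReadToEndAsync();
        log.LogInformation("Received payload of {Length} characters.", body.Length);

        // Complex business logic, third-party libraries, etc. would go here.
        return new OkObjectResult("Processed");
    }
}
```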
In the past, developers who wanted to define their own custom messages and complex business logic within Microsoft Dataverse had to use Custom Actions. With the introduction of Custom APIs, we have a modern and scalable way to provide this functionality and benefit from features such as enhanced security integration, localisation, and the ability to create our own custom OData functions that we can call via the Web API. Pretty cool stuff, I'm sure you'll agree. 😀
In this session, I'll provide an overview of Custom APIs, how they compare to Custom Actions, and demonstrate how both low-code and pro developers can use them.
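As a quick illustration of the pro-developer side, here's a sketch of invoking a hypothetical Custom API, new_CalculateDiscount, using the SDK's generic OrganizationRequest. The message and parameter names are placeholders for whatever your own Custom API defines.

```csharp
using Microsoft.Xrm.Sdk;

// Calls a hypothetical Custom API named "new_CalculateDiscount".
// "service" is an already-authenticated IOrganizationService, e.g. one
// obtained via Microsoft.PowerPlatform.Dataverse.Client.
public static class CustomApiCaller
{
    public static decimal CalculateDiscount(IOrganizationService service, decimal amount)
    {
        var request = new OrganizationRequest("new_CalculateDiscount")
        {
            // Request parameter names must match the Custom API definition.
            ["Amount"] = amount
        };

        OrganizationResponse response = service.Execute(request);

        // Response parameters are likewise defined on the Custom API itself.
        return (decimal)response.Results["DiscountedAmount"];
    }
}
```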
29 Feb 2020
Glasgow, Scotland, United Kingdom
9 Jul 2019
London, England, United Kingdom
Co-organiser; also delivered my talk 'Leveraging Power BI with Dynamics 365 Customer Engagement'
9 Apr 2019
Manchester, England, United Kingdom
Delivered my talk 'Leveraging Power BI with Dynamics 365 Customer Engagement'
28 Mar 2019
Manchester, England, United Kingdom