Securing your Azure AI Workload

You've been tasked with building an AI workload that works with your data so your customers can get answers. Perhaps you even want to use some of these new "agents" you've been hearing so much about. Maybe you also need to see how your existing apps can integrate with that workload.

But AI is scary. Just ask Samsung how they felt after employees pasted proprietary code into ChatGPT. Look at the Canadian airline that was ordered to honor a refund its chat agent promised a customer. Or the guy who talked a dealership chatbot into selling him a truck for $1. It's challenging and scary - and that's just the implementation. To be clear, learning how to ground your data properly is an entirely different session; while we'll touch on that briefly, this session is about the workload in Azure itself.

What about the exposure of your data through your Azure services? How can you keep your HIPAA, GDPR, and NIST compliance *and* integrate an Azure AI workload?

In this session, you will learn how to use Azure services correctly to protect your AI workload from public exposure, and how to connect securely to your AI deployments from your other secured services.
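As a taste of the patterns involved, here is a minimal sketch, assuming the current Python openai and azure-identity packages, of connecting to an Azure OpenAI deployment with Microsoft Entra ID (keyless) authentication instead of a shared API key; the endpoint, API version, and deployment name below are placeholders.

```python
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AzureOpenAI

# Acquire tokens through managed identity or developer sign-in:
# no API key sits in app config, so there is no key to leak.
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default",
)

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder
    azure_ad_token_provider=token_provider,
    api_version="2024-06-01",  # pin to the API version you actually target
)

response = client.chat.completions.create(
    model="<your-deployment-name>",  # your deployment name, not the base model
    messages=[{"role": "user", "content": "What is our refund policy?"}],
)
print(response.choices[0].message.content)
```

Pair this with a private endpoint and `publicNetworkAccess` disabled on the resource, and the deployment is reachable only from inside your own network.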

AI security and governance are critical concerns, and while talks about the flashy things AI can do tend to be more exciting, we need to know how to keep these workloads secure in our deployment architectures.

Brian Gorman

Microsoft Azure MVP, Speaker, Author, Trainer, and .NET Developer

Waterloo, Iowa, United States
