Deploy large language models responsibly with Azure AI

Generative AI applications, powered by large language models (LLMs) like Meta’s Llama 2 and OpenAI’s GPT-4, have opened up new possibilities for natural language processing (NLP) tasks such as content and code generation, summarization, and search.
However, these applications also pose new challenges and risks, such as bias, toxicity, misinformation, and manipulation. How can we leverage the benefits of generative AI while mitigating the potential harms?

In this session, we will explore how Azure AI can help you build and deploy generative AI applications responsibly, following the best practices and guidance from Microsoft’s responsible AI principles and governance systems.

You will learn how to:
- Discover and evaluate various LLMs from Azure OpenAI Service, Meta, Hugging Face, and other sources using the Azure Machine Learning model catalog.
- Fine-tune and optimize the models for your specific use cases and objectives using Azure Machine Learning prompt flow and advanced optimization technologies.
- Deploy the models and prompt flows as scalable and secure endpoints using Azure Machine Learning Studio and SDK.
- Monitor the deployed models and prompt flows for data drift, model performance, groundedness, token consumption, and infrastructure performance using Azure Machine Learning model monitoring.
- Apply responsible AI tools and techniques to prevent harmful content generation, enforce data grounding, disclose AI’s role, and reinforce user responsibility.
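Two of the guardrails in the last point above, blocking harmful content before generation and disclosing the AI's role, can be sketched conceptually. The snippet below is a minimal, hypothetical illustration in plain Python; the blocklist, function names, and wrapper are assumptions for demonstration, not the Azure AI Content Safety service or prompt flow API, which provide far richer, model-based classification.

```python
# Minimal conceptual sketch of two responsible-AI guardrails:
# (1) a pre-generation blocklist check, (2) an AI-role disclosure wrapper.
# This is an illustrative stand-in, NOT the Azure AI Content Safety API.

BLOCKED_TERMS = {"make a bomb", "credit card dump"}  # hypothetical blocklist


def is_allowed(prompt: str) -> bool:
    """Reject prompts containing blocked phrases before they reach the model."""
    lowered = prompt.lower()
    return not any(term in lowered for term in BLOCKED_TERMS)


def respond(prompt: str, generate) -> str:
    """Gate generation behind the filter and disclose that output is AI-generated."""
    if not is_allowed(prompt):
        return "This request was blocked by the content filter."
    return f"[AI-generated] {generate(prompt)}"
```

In a real deployment, `generate` would call the deployed model endpoint, and the blocklist check would be replaced by a managed content-safety classifier on both the prompt and the completion.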

Whether you are a seasoned AI practitioner or a beginner looking to expand your knowledge, this session will equip you with valuable insights and skills to operationalize generative AI with Azure AI.

Emilie Lundblad

Microsoft MVP & RD - Make the world better with Data & AI

Copenhagen, Denmark
