
Emilie Lundblad


Microsoft MVP & RD - Make the world better with Data & AI

Copenhagen, Denmark


Microsoft MVP in AI | Microsoft Regional Director | Board Member at Pioneer Center for AI & Danish Data Science Community

Emilie Lundblad stands at the forefront of Data & AI, holding prestigious roles such as Microsoft MVP in AI and Microsoft Regional Director. Her expertise spans leveraging AI in multi-agent systems for chemical simulations and manufacturing. As a board member at the Pioneer Center for Artificial Intelligence and the Danish Data Science Community, she actively contributes to shaping the future of AI and data science.

Emilie holds a Master's in Quantitative Finance from the University of Southern Denmark and a minor in Machine Learning & AI from MIT. She is completing an Executive Master's in IT Management at the IT University of Copenhagen, where she also guest lectures.

With over 14 years of professional experience, Emilie’s career highlights include roles as MD at Amesto, Head of Business Intelligence at GroupM, and EMEA Innovation Lead at WPP. A recognized thought leader, she has delivered impactful keynotes at venues like IDA, Danish IT, the Danish Government IT Conference, and Microsoft events.

Emilie's collaborative projects with institutions like MIT and the Technical University of Denmark focus on applying machine learning to challenges such as dynamic pricing and sustainable solutions, underpinning her commitment to enhancing business efficacy and sustainability.

Her accolades include nominations for Hyperight’s Top 100 Nordic Data & AI Influencers 2023, Berlingske Talent 100, and the Nordic Women in Tech Award for Diversity Leader of the Year, showcasing her as a beacon of innovation and leadership in the Data & AI sector.

Awards

  • Most Active Speaker 2023

Area of Expertise

  • Information & Communications Technology

Topics

  • Microsoft AI
  • Microsoft Data Platform
  • Microsoft Power Automate
  • Microsoft Power BI
  • Microsoft Azure Cognitive Services
  • Data Science & AI
  • Big Data
  • Data Analytics
  • Data Visualization
  • Azure Data Factory
  • Azure Data & AI
  • Azure Data Lake
  • Data Science
  • Azure Data Platform
  • Analytics and Big Data
  • Data Privacy
  • Microsoft OpenAI
  • ChatGPT
  • Sustainable Development
  • AI & ML Solutions
  • AI for Social Good
  • AI & ML Architecture
  • AI Ethics
  • AI for Startups
  • Generative AI
  • Generative Coding
  • OpenAI

Chat with your SQL Database

Agenda:

Introduction (10 minutes): Brief overview of Azure AI, AI Search, OpenAI service, and the concept of natural language querying.

Setting Up Azure AI Services (10 minutes): Demonstration of how to set up Azure AI services and navigate Azure OpenAI Studio.

Understanding SQL Databases (10 minutes): Introduction to SQL databases and how to structure queries for efficient data retrieval.

Implementing Azure AI Search (10 minutes): Showcase of how to create a search service and implement search functionality for your chatbot to interact with SQL databases.

Integrating Azure OpenAI Service (10 minutes): Explanation of how to apply advanced AI language models to your search solutions using Azure OpenAI Service.

Creating a Natural Language Interface (5 minutes): Demonstration of how to build a natural language interface for querying your SQL database.

Q&A Session (5 minutes): Open discussion and answers to participant queries.

Learning Outcomes: By the end of this session, participants will have a clear understanding of how to leverage Azure AI, AI Search, and OpenAI service to create a natural language interface for querying an SQL database.

Who Should Attend: This session is ideal for developers, AI enthusiasts, and anyone interested in exploring the capabilities of Azure AI, AI Search, and OpenAI service.

Please note that this is a high-level overview and there are many more detailed features and capabilities within each of these categories.
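
To give a concrete taste of the final step, here is a minimal sketch of the core pattern, assuming an Azure OpenAI chat deployment and an ODBC-reachable SQL database; the deployment name, connection string, and table schema below are placeholders, not part of the session material.

# Minimal sketch: translate a natural-language question into SQL with Azure OpenAI,
# then run it against the database. All names below are placeholders.
import os
import pyodbc
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

schema = "Table Sales(OrderId INT, Country NVARCHAR(50), Amount DECIMAL(10,2), OrderDate DATE)"
question = "What was the total revenue per country last month?"

completion = client.chat.completions.create(
    model="gpt-4o",  # placeholder deployment name
    messages=[
        {"role": "system", "content": "You write a single T-SQL query and nothing else. Schema: " + schema},
        {"role": "user", "content": question},
    ],
)
sql = completion.choices[0].message.content.strip()

# Always review generated SQL before running it against a real database.
connection = pyodbc.connect(os.environ["SQL_CONNECTION_STRING"])
for row in connection.cursor().execute(sql):
    print(row)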

Create your own private ChatGPT in Azure

60-minute session where I will explore the process of creating a private corporate chatbot using Azure AI, AI Search, and OpenAI service. This session is designed to provide a hands-on experience, demonstrating how these technologies can revolutionize your chatbot development process.

Agenda:

Introduction (10 minutes): Brief overview of Azure AI, AI Search, and OpenAI service.

Setting Up Azure AI Services (15 minutes): Demonstration of how to set up Azure AI services and navigate Azure OpenAI Studio.

Implementing Azure AI Search (15 minutes): Showcase of how to create a search service and implement search functionality for your chatbot.

Integrating Azure OpenAI Service (10 minutes): Explanation of how to apply advanced AI language models to your search solutions using Azure OpenAI Service.

Creating a Private Corporate ChatGPT (5 minutes): Demonstration of how to separate your language model from your knowledge base to create a private corporate ChatGPT.

Q&A Session (5 minutes): Open discussion and answers to participant queries.

Learning Outcomes: By the end of this session, participants will have a clear understanding of how to leverage Azure AI, AI Search, and OpenAI service to create a private corporate chatbot.

Who Should Attend: This session is ideal for developers, AI enthusiasts, and anyone interested in exploring the capabilities of Azure AI, AI Search, and OpenAI service.

Please note that this is a high-level overview and there are many more detailed features and capabilities within each of these categories.
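
As a flavour of the pattern demonstrated, here is a minimal retrieval-augmented sketch, assuming an existing Azure AI Search index and an Azure OpenAI chat deployment; the index name, field name, and deployment name are illustrative placeholders.

# Minimal RAG sketch: retrieve documents from Azure AI Search and let an Azure OpenAI
# chat deployment answer using only that context. All names are placeholders.
import os
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

search = SearchClient(
    endpoint=os.environ["AZURE_SEARCH_ENDPOINT"],
    index_name="company-docs",  # placeholder index name
    credential=AzureKeyCredential(os.environ["AZURE_SEARCH_KEY"]),
)
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

question = "What is our travel reimbursement policy?"
results = search.search(question, top=3)
context = "\n".join(doc["content"] for doc in results)  # assumes a 'content' field

answer = client.chat.completions.create(
    model="gpt-4o",  # placeholder deployment name
    messages=[
        {"role": "system", "content": "Answer only from this context:\n" + context},
        {"role": "user", "content": question},
    ],
)
print(answer.choices[0].message.content)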

Unleashing the Power of AI, Data Science, and Machine Learning with Microsoft Fabric

60-minute session where I dive into the world of Microsoft Fabric, exploring its AI, Data Science, and Machine Learning capabilities. This session is designed to provide a hands-on experience, demonstrating how Microsoft Fabric can revolutionize your data analytics and machine learning workflows.

Agenda:

Introduction (10 minutes): Brief overview of Microsoft Fabric and its capabilities.

AI Capabilities (10 minutes): Demonstration of how Microsoft Fabric integrates technologies like Azure AI studio, Azure Synapse Analytics, and Power BI to create everyday AI experiences.

Data Science Capabilities (15 minutes): Walkthrough of end-to-end data science workflows in Microsoft Fabric, from data exploration and preparation to experimentation and modeling.

Machine Learning Capabilities (15 minutes): Showcase of how to create, track, and manage machine learning models in Microsoft Fabric.

Q&A Session (10 minutes): Open discussion and answers to participant queries.

Learning Outcomes: By the end of this session, participants will have a clear understanding of how to leverage Microsoft Fabric’s AI, Data Science, and Machine Learning capabilities to enhance their data analytics and machine learning projects.

Who Should Attend: This session is ideal for data scientists, AI enthusiasts, machine learning practitioners, and anyone interested in exploring the capabilities of Microsoft Fabric.

Please note that this is a high-level overview and there are many more detailed features and capabilities within each of these categories.
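
As a small taste of the model-tracking part, here is a hedged sketch of the MLflow pattern that Fabric notebooks support out of the box; the experiment name and toy dataset are illustrative, and run outside Fabric the same code simply logs to a local MLflow store.

# Minimal sketch: train and track a model with MLflow, the tracking layer Fabric uses natively.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("fabric-demo-experiment")  # illustrative experiment name
with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", model.score(X_test, y_test))
    mlflow.sklearn.log_model(model, "model")  # stores the trained model as a run artifact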

Elevating E-Commerce with AI & Real-Time: A Microsoft Fabric Approach


Using Fabric as a full-blown AI and ML platform can be intriguing, but also cumbersome to get started with.

In this session we will guide you through the bits and bytes of building a machine learning/AI model in Fabric with the built-in services.
With improving the customer experience on an e-commerce platform as our stepping stone, we get started with the basic steps.

Product and customer profiling using AI, and building a recommender engine from it to use in the webshop, will also be part of the session.
All of this work needs governance and orchestration - we'll guide you through the built-in MLflow in Fabric, which gives you OOTB model orchestration and a bit of governance for your work with AI/ML models.

After the AI/ML model has been built, we then step into the real-time analytics space and look at the events from the webshop with live alerts and bleeding-edge real-time dashboards (not Power BI) to get live insights into the customers' behaviour on the site.

From this level 200 session we hope you will go home with elevated knowledge of all the news and services in Fabric for building an ML model with real-time insights.
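
To illustrate the recommender step with a toy example (hypothetical data, not the session's dataset), a simple item-to-item similarity sketch looks like this:

# Toy item-to-item recommender sketch on hypothetical purchase data.
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

# Rows: customers, columns: products (1 = purchased).
purchases = np.array([
    [1, 0, 1, 0],
    [0, 1, 1, 1],
    [1, 1, 0, 0],
])
products = ["shoes", "socks", "laces", "polish"]

item_sim = cosine_similarity(purchases.T)  # product-to-product similarity
np.fill_diagonal(item_sim, 0)              # ignore self-similarity
for i, name in enumerate(products):
    print(f"Customers who bought {name} may also like {products[item_sim[i].argmax()]}")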

AI & ML in Fabric

1. Introduction: Explore how AI & ML, integrated with Microsoft Fabric, can transform customer experiences on e-commerce platforms.
2. Customer and Product Profiling: Demonstrate profiling customers and products in Power BI, utilizing data from Microsoft Fabric's OneLake, to derive actionable insights.
3. Building a Recommender Engine: Show how to leverage these insights to transform a Large Language Model (LLM) into an effective product recommendation engine, aimed at boosting basket sizes and sales performance.
4. Interactive Chatbot Creation: Illustrate the development of a chatbot using Azure AI Studio, capable of handling product inquiries and customer complaints, by accessing the OneLake's extensive product database and historical order data.
5. Enhancing Chatbot Responses: Detail the process of utilizing Microsoft Fabric’s notebooks for improving chatbot interactions, emphasizing analysis, comparison, and scoring of chatbot responses against human interactions with help from prompt flow (a minimal scoring sketch follows this list).
6. Data-Driven Decisions: Highlight how to use the insights gathered from the chatbot and recommendation engine to inform business strategies.
7. Visualizing Impact: Showcase the integration of these AI models with real-time data visualization tools in Microsoft Fabric, emphasizing their effect on customer engagement and sales trends.
8. Model Orchestration: Demonstrate the orchestration of these models in Microsoft Fabric for real-time inference, ensuring seamless and efficient operations.
9. Scalability and Collaboration Features: Discuss the scalability of Microsoft Fabric's AI & ML solutions and its collaborative environment for teams.
10. Q&A
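
A minimal sketch of the response-scoring idea in step 5, using an Azure OpenAI chat deployment as the judge; the deployment name and example answers are placeholders, and the session itself drives this through prompt flow.

# Minimal LLM-as-judge sketch: score a chatbot answer against a human agent answer.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

bot_answer = "You can return the shoes within 30 days with the receipt."
human_answer = "Returns are accepted for 30 days; just bring your receipt or order number."

judge_prompt = (
    "Score from 1-5 how closely the chatbot answer matches the human agent answer, "
    "then explain briefly.\n"
    f"Chatbot: {bot_answer}\nHuman: {human_answer}"
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder deployment name
    messages=[{"role": "user", "content": judge_prompt}],
)
print(response.choices[0].message.content)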

Microsoft's collaboration with Hugging Face

In this session, I take you on a chronological journey through the groundbreaking collaboration between Microsoft and Hugging Face, and its transformative impact on AI and Machine Learning.

We’ll start by exploring the origins of the partnership, which began with the development of Hugging Face Endpoints, a machine learning inference service underpinned by Azure ML Managed Endpoint.
This initial collaboration set the stage for a series of innovative developments that have revolutionized the AI landscape.

Next, we’ll delve into the launch of the Hugging Face Model Catalog on Azure. This catalog, filled with thousands of popular Transformers models from the Hugging Face Hub, is directly available within Azure Machine Learning Studio.
I will highlight how this integration allows users to deploy Hugging Face models on managed endpoints, running on secure and scalable Azure infrastructure.

We’ll then discuss the challenges faced in deploying Transformers to production and how this collaboration has addressed these issues.
We’ll highlight how the partnership has simplified the deployment process of large language models and provided a secure and scalable environment for real-time inferencing.

Finally, we’ll explore the future of this collaboration and its implications for the AI industry. We’ll discuss how the integration of Hugging Face’s open-source models into Azure Machine Learning represents Microsoft’s commitment to empowering developers with industry-leading AI tools.

Takeaways:
- Understanding of the Microsoft and Hugging Face collaboration and its impact on AI and Machine Learning.
- Insights into the features and benefits of the Hugging Face Model Catalog on Azure.
- Knowledge of the challenges and solutions in deploying Transformers to production.

Join and discover how this collaboration is shaping the future of AI and Machine Learning.
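
As a hedged sketch of the deployment pattern described above, using the azure-ai-ml SDK: the registry model ID, instance SKU, and workspace details are illustrative assumptions; check the Azure ML model catalog for the exact asset ID of the model you choose.

# Sketch: deploy a Hugging Face catalog model to an Azure ML managed online endpoint.
from azure.ai.ml import MLClient
from azure.ai.ml.entities import ManagedOnlineEndpoint, ManagedOnlineDeployment
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

# Assumed asset path for a model published in the HuggingFace registry.
model_id = "azureml://registries/HuggingFace/models/bert-base-uncased/labels/latest"

endpoint = ManagedOnlineEndpoint(name="hf-demo-endpoint", auth_mode="key")
ml_client.online_endpoints.begin_create_or_update(endpoint).result()

deployment = ManagedOnlineDeployment(
    name="blue",
    endpoint_name=endpoint.name,
    model=model_id,
    instance_type="Standard_DS3_v2",  # illustrative SKU
    instance_count=1,
)
ml_client.online_deployments.begin_create_or_update(deployment).result()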

OpenAI's models in Azure Machine Learning Studio

Unleash the power of AI with Azure! Dive into the exciting world of OpenAI models and discover how to fine-tune them using Azure Machine Learning. Here’s what you can expect:

Uncover the secrets of the new collection of OpenAI models in the Azure Machine Learning model catalog, all equipped with fine-tuning capabilities.

- Learn how Azure Machine Learning brings to light the fine-tuning of OpenAI models.
- Experience how Azure Machine Learning amplifies fine-tuning to deliver a rich and engaging user experience.
- Master the art of fine-tuning and evaluating Azure OpenAI models in the model catalog.
- Become proficient in deploying your fine-tuned models into the Azure OpenAI Service.
- And that’s not all! Witness a live demo of indexing and searching your own data with GPT Fine-tuning best practices. Don’t miss out on this opportunity to elevate your AI skills to new heights!

Best after Ignite 2023.
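
A minimal sketch of the fine-tuning workflow covered in the session, assuming an Azure OpenAI resource with fine-tuning enabled and a JSONL training file in the chat format; the base model name, file name, and API version are placeholders.

# Sketch: submit an Azure OpenAI fine-tuning job from Python.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",  # placeholder; use a version that supports fine-tuning
)

training_file = client.files.create(
    file=open("train.jsonl", "rb"), purpose="fine-tune"
)

job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-35-turbo",  # placeholder base model name
)
print(job.id, job.status)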

Discover vision models in the Azure Machine Learning model catalog

An enlightening session where we explore the new vision models in the model catalog on the Azure Machine Learning platform.
This session will provide you with a comprehensive understanding of various vision models for image classification, object detection, and image segmentation.

We’ll delve into the practical aspects of using these models, including how to fine-tune, evaluate, and deploy them using Azure Machine Learning Studio. The session will also highlight the importance of responsible AI and the ethical use of technology.

Whether you’re a seasoned AI practitioner or a beginner looking to expand your knowledge, this session will equip you with valuable insights and skills. Don’t miss out on this opportunity to learn from the experts and enhance your AI capabilities with Azure Machine Learning.

Takeaways:

- Understanding of various vision models in Azure Machine Learning’s model catalog.
- Practical knowledge on fine-tuning, evaluating, and deploying these models.
- Insights into responsible AI and ethical use of technology.

Looking forward to seeing you there!
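
For a quick feel of the model family covered, here is a hedged local sketch using a public Hugging Face ViT checkpoint as an illustrative stand-in for the catalog's image-classification models; the image path is a placeholder.

# Sketch: run an image-classification model of the same family locally with transformers.
from transformers import pipeline

classifier = pipeline("image-classification", model="google/vit-base-patch16-224")
predictions = classifier("path/to/your_image.jpg")  # placeholder image path or URL
print(predictions[:3])  # top predicted labels with confidence scores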

OpenAI’s models in Azure Machine Learning service

Learn how to use the OpenAI text, image and code models within Azure Machine Learning service.
We will dive into how to set the prompts and filters, and how to fine-tune the models.

We will go through 3 different tutorials:
1. How to create a book with text-davinci & DALL·E
2. How to create a digital assistant
3. How to improve code with Codex, GitHub Copilot & ChatGPT for machine learning services

After this session you should be ready to start using OpenAI models in your daily work in machine learning service.

30 to 90 minutes
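
A minimal sketch of tutorial 1, assuming Azure OpenAI deployments for a chat model and DALL·E; the tutorial uses text-davinci, while the sketch below swaps in a chat deployment, and the deployment names are placeholders.

# Sketch: generate a short story and a matching cover image with Azure OpenAI.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

story = client.chat.completions.create(
    model="gpt-4o",  # placeholder chat deployment
    messages=[{"role": "user", "content": "Write a two-sentence children's story about a brave robot."}],
)
print(story.choices[0].message.content)

cover = client.images.generate(
    model="dall-e-3",  # placeholder image deployment
    prompt="A friendly robot on a hill at sunrise, storybook illustration",
    n=1,
)
print(cover.data[0].url)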

Prebuilt AI in Azure - AI Studio

Learn how to use prebuilt AI in Azure AI Studio, from OpenAI and what was previously Cognitive Services.

In this session, we will see how to use the services in AI Studio and with Python in a notebook in Azure Machine Learning.

We will dive into the possibilities of the tools and how to easily get started.
We will go through 4 different tutorials:
- Speech
- Vision
- Language
- Responsible AI

After this session, you should be ready to start using AI Studio as part of your data platform.

60 minutes unless some services are cut out.
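
As an example of calling one of these prebuilt services from Python, here is a minimal sketch for the Language part using the Azure AI Language (Text Analytics) client; the endpoint and key environment variables are placeholders.

# Sketch: sentiment analysis with the prebuilt Azure AI Language service.
import os
from azure.core.credentials import AzureKeyCredential
from azure.ai.textanalytics import TextAnalyticsClient

client = TextAnalyticsClient(
    endpoint=os.environ["AZURE_LANGUAGE_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["AZURE_LANGUAGE_KEY"]),
)

documents = ["The new dashboard is fantastic!", "The checkout flow keeps failing."]
for doc in client.analyze_sentiment(documents):
    print(doc.sentiment, doc.confidence_scores)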

AI for Coding

AI for coding will focus on using ChatGPT, Codex and GitHub Copilot to write code.

GitHub Copilot & Codex
o What can you use it for?
o How to use GitHub Copilot?
o What is the difference between Codex & GitHub Copilot?
o Copilot vs. Copilot Labs vs. Copilot Chat, with amazing possibilities

ChatGPT
o When should you NOT use ChatGPT to write code?
o When should you use ChatGPT to write code?
o How can ChatGPT help in debugging?

60 minutes, but can be cut to 30 minutes or extended to 90 minutes.
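
To show the kind of debugging help discussed, here is a minimal sketch that asks the ChatGPT API to spot a bug; the model name is a placeholder, and in practice much of this happens interactively in ChatGPT or Copilot Chat rather than through code.

# Sketch: ask ChatGPT (via the OpenAI API) to find and fix a bug in a snippet.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

buggy_code = """
def average(numbers):
    return sum(numbers) / len(numbers) + 1  # off-by-one mistake
"""

reply = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "Find and fix the bug in this function:\n" + buggy_code}],
)
print(reply.choices[0].message.content)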

Generative AI with Prompt flow in Azure Machine Learning studio

Learn how to use generative AI with Prompt flow in Azure Machine Learning.

This session teaches you how to use Prompt flow, a new feature in Azure Machine Learning studio, to create and manage generative AI models, with examples such as:
- Ask questions to Wikipedia
- Q&A with your own data
- Web classification

You will also learn how to generate text, images, code, and other types of content using pre-trained models or custom models, and how to apply best practices and ethical principles for using generative AI in various domains and scenarios.

Best after Ignite 2023

How OpenAI's collaboration with Microsoft will change the world

How OpenAI’s collaboration with Microsoft will change the world & how we work

Key takeaways:
- You will gain perspective on LLMs, multimodal models & AGI - Artificial General Intelligence.
- You will learn about OpenAI’s collaboration with Microsoft and why it has been instrumental in delivering a wide range of research, products & tools.

- In addition, we will dive into OpenAI’s research and explain how combining these threads leads towards AGI - Artificial General Intelligence.
- Touch upon news from OpenAI’s developer day
- Fast-paced walk-through of useful tools like GitHub Copilot Labs & Chat, ChatGPT, Edge Copilot, & OpenAI on Azure.
- You will learn the research-based facts on why AI can improve your efficiency & quality of work
- You will learn why Satya Nadella claims this to be the era of AI, why you should stay tuned for Ignite, and how AI will change both our daily work and leadership

You will not look at work or AI the same way after this session.
Explore the transformative potential of AI, discussing how AI technologies like ChatGPT and Microsoft's AI Copilot could revolutionize work & leadership from the ground up.

LLMOps: Operationalizing and Managing Large Language Models using Azure ML

Large language models (LLMs) like GPT-4 have revolutionized the field of natural language processing (NLP) with their ability to generate human-like text and perform various tasks based on the input provided. However, to fully unlock the potential of these pre-trained models, it is essential to streamline the deployment and management of these models for real-world applications. This session will guide you through the process of operationalizing LLMs, including prompt engineering and tuning, fine-tuning, and deployment, as well as the benefits and challenges associated with this new paradigm.

You will learn how to:
- Access and discover various LLMs from Azure OpenAI Service, Hugging Face, and other sources using the Azure Machine Learning model catalog.
- Tune the prompts and fine-tune the models for domain-specific grounding using Azure Machine Learning prompt flow and advanced optimization technologies.
- Deploy the models and prompt flows as scalable and secure endpoints using Azure Machine Learning Studio and SDK.
- Monitor the deployed models and prompt flows for data drift, model performance, groundedness, token consumption, and infrastructure performance using Azure Machine Learning model monitoring.
- Apply responsible AI principles and best practices to ensure ethical and compliant use of LLMs.

Whether you are a seasoned AI practitioner or a beginner looking to expand your knowledge, this session will equip you with valuable insights and skills to operationalize LLMs with Azure Machine Learning. Join us for this exciting journey and discover how LLMOps is shaping the future of AI and Machine Learning.
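
As a small companion to the deployment step, here is a hedged sketch of invoking a deployed prompt flow or model endpoint with the Azure ML SDK; the workspace details, endpoint name, and request file are placeholders.

# Sketch: call an Azure ML managed online endpoint hosting an LLM or prompt flow.
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

response = ml_client.online_endpoints.invoke(
    endpoint_name="llm-demo-endpoint",   # placeholder endpoint name
    request_file="sample_request.json",  # JSON payload matching the flow's inputs
)
print(response)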

Deploy large language models responsibly with Azure AI

Generative AI applications, powered by large language models (LLMs) like Meta’s Llama 2 and OpenAI’s GPT-4, have transformed the generative AI (GenAI) landscape. These advancements unlock incredible potential for deploying sophisticated GenAI solutions. However, they also introduce new challenges, such as risks related to monitoring and evaluation, biases, hallucinations, prompt injection vulnerabilities, and potential misuse. To leverage these benefits responsibly, Microsoft’s Responsible AI principles and the governance framework provided by Azure AI Studio offer a powerful structure for developing, deploying, and managing LLM-based solutions.

In this talk, we’ll cover how Azure AI Studio enables the responsible deployment of generative AI applications, with a structured approach across the following critical stages:

Discovering and Exploring LLMs in the Azure AI Model Catalog
Azure AI Studio’s Model Catalog offers a comprehensive repository where you can explore, evaluate, and select LLMs like GPT-4 and other models from providers such as Hugging Face and Meta. We’ll discuss the catalog’s capabilities for model selection based on criteria like efficiency, bias, and alignment with specific use cases.

Fine-tuning and Optimizing Models with Azure AI Studio’s Prompt Flow
Azure AI Studio supports fine-tuning and prompt engineering through its Prompt Flow feature. We’ll explore how to optimize models to align with your objectives, using techniques such as prompt testing and reinforcement learning from human feedback (RLHF) to create precise and context-aware interactions.

Deploying Models as Secure and Scalable Endpoints
With Azure AI Studio, deploying models as secure, scalable endpoints is straightforward and robust. We’ll cover managed online endpoints, private endpoints for secure access, and how to utilize scaling features to handle high-traffic applications effectively.

Monitoring and Evaluating for Responsible Use with Model Evaluation and Monitoring
Monitoring is essential for maintaining model performance and ethical use. Azure AI Studio provides built-in tools for monitoring data drift, token consumption, groundedness, and detecting hallucinations. We’ll cover how these tools ensure ongoing model accuracy, reliability, and compliance with Responsible AI principles.

Applying Responsible AI Tools to Safeguard Ethical AI Use
Azure AI Studio integrates Microsoft’s Responsible AI principles through tools like the Responsible AI dashboard, error analysis, interpretability tools, and Azure Content Safety. We’ll explore how these tools help developers and organizations safeguard against harmful or biased content generation, ensuring that LLM applications align with ethical standards and transparency.
Conclusion

Azure AI Studio offers a comprehensive, responsible, and secure environment to build, deploy, and manage generative AI solutions effectively. Whether you’re a seasoned AI developer or new to LLMs, Azure AI Studio’s features help ensure that your applications are high-performing and aligned with Microsoft’s Responsible AI standards.
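
To make one of the Responsible AI safeguards above concrete, here is a hedged sketch of screening model output with Azure AI Content Safety before it reaches users; the endpoint and key are placeholders, and the exact response fields may vary by SDK version.

# Sketch: screen generated text with Azure AI Content Safety.
import os
from azure.core.credentials import AzureKeyCredential
from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeTextOptions

client = ContentSafetyClient(
    endpoint=os.environ["CONTENT_SAFETY_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["CONTENT_SAFETY_KEY"]),
)

result = client.analyze_text(AnalyzeTextOptions(text="Some model-generated answer to check."))
for category in result.categories_analysis:
    print(category.category, category.severity)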

Microsoft & OpenAI's Journey Towards AGI

Artificial General Intelligence (AGI) is the ultimate goal of AI research, where machines can perform any intellectual task that humans can. In this session, we will explore how the collaboration of Microsoft and OpenAI, two of the leading organizations in AI, are bringing us closer to AGI.

We will cover the following topics:

- The vision and mission of OpenAI, and how it is pursuing AGI through various research domains, such as games, debate, text, image, music, audio, robotics, and code generation.

- The strategic partnership between Microsoft and OpenAI, and how it is enabling the development of AI supercomputers, such as Azure Machine Learning, tailored for OpenAI’s groundbreaking projects, such as GPT-4 and DALL-E.

- The latest innovations in Large Language Models (LLMs) and Multimodal LLMs (MLLMs), and how they are reshaping the frontiers of AI with their ability to generate human-like text and images and perform various tasks based on the input provided.

- The emerging applications and use cases of LLMs and MLLMs, such as content and code generation, summarization, search, mind-reading, emotion decoding, and more.

- The challenges and opportunities of deploying and managing LLMs and MLLMs, such as model size, GPU limitations, computational efficiency, model training, and more.

- The ethical and social implications of LLMs and MLLMs, and how Microsoft and OpenAI are applying responsible AI principles and best practices to ensure ethical and compliant use of the technology.

Whether you are a seasoned AI practitioner or a beginner looking to expand your knowledge, this session will equip you with valuable insights and skills to understand and leverage the power of LLMs and MLLMs, and how they are paving the way for AGI.

How to Ensure Microsoft GenAI Solutions are Both Responsible and Compliant to the EU AI Act

Description: As generative AI (GenAI) continues to revolutionize industries, ensuring that these solutions are responsible and compliant with regulatory frameworks like the EU AI Act is paramount. In this session, Emilie Lundblad, Director of AI & Automation at Hempel, will explore how to build and deploy GenAI applications that align with both responsible AI principles and the stringent requirements of the EU AI Act. Attendees will gain insight into the operationalization of responsible AI through Microsoft's trusted framework and learn best practices for addressing compliance challenges in high-risk AI scenarios.

The session will cover:
- An overview of generative AI and its transformative role in business.
- Key provisions of the EU AI Act, including high-risk AI classifications and prohibited practices.
- Microsoft's approach to responsible AI, ensuring fairness, transparency, and accountability.
- Real-world examples of compliant GenAI implementations in Microsoft Copilot and Azure AI.
- Best practices and steps for aligning AI solutions with the EU AI Act's requirements.

Also booked for DrivingIT in Denmark in November

Microsoft Learn Zero to Hero Community User group Sessionize Event

December 2023, May 2024, November 2024

Triangle Area SQL Server User Group (TriPASS) User group Sessionize Event

July 2024, October 2024

Data Saturday Oslo 2023 Sessionize Event

September 2023 Oslo, Norway

Data Saturday Denmark - 2023 Sessionize Event

March 2023 Kongens Lyngby, Denmark

Extend Women in Tech Podcast User group Sessionize Event

February 2023

Azure User Group Sweden User group Sessionize Event

February 2023

Microsoft Fabric Usergroup Denmark User group Sessionize Event

January 2023
