Session

Managing large language model operations (LLMOps) in Azure ML Studio and Azure AI Studio

Given the rapid pace of neural network development, frequent updates to the source data on which models are fine-tuned, and the growing complexity of systems that combine several models to improve answer accuracy, it is essential to continuously monitor the quality of your solution, especially when you change prompts, switch to newer models, or update the source data. This session covers two key Microsoft solutions that let you manage the quality of your solution's responses by simulating large numbers of user requests and analyzing the results, as well as monitoring the performance and quality of your LLM-based solution in production.
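To make the evaluation workflow concrete, the sketch below shows what a batch quality-evaluation run can look like with the azure-ai-evaluation Python SDK. The endpoint, API key, deployment name, and the eval_data.jsonl file are placeholder assumptions, and exact class and parameter names may differ between SDK versions; this is an illustrative sketch, not the session's reference implementation.

# pip install azure-ai-evaluation  (assumed package; API may vary by version)
from azure.ai.evaluation import evaluate, RelevanceEvaluator, GroundednessEvaluator

# Placeholder configuration for the model that acts as the judge.
model_config = {
    "azure_endpoint": "https://<your-resource>.openai.azure.com",  # placeholder
    "api_key": "<your-api-key>",                                   # placeholder
    "azure_deployment": "gpt-4o",                                  # placeholder deployment name
}

# AI-assisted quality evaluators that score each simulated request/response pair.
relevance = RelevanceEvaluator(model_config)
groundedness = GroundednessEvaluator(model_config)

# eval_data.jsonl is a placeholder file with one record per simulated user request,
# e.g. {"query": "...", "response": "...", "context": "..."}.
result = evaluate(
    data="eval_data.jsonl",
    evaluators={
        "relevance": relevance,
        "groundedness": groundedness,
    },
)

print(result["metrics"])  # aggregated scores across all simulated requests

Re-running a batch like this after every prompt change, model upgrade, or data refresh is what makes regressions in answer quality visible before they reach users.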

Artem Chernevskiy

Azure AI Platform MVP

Razlog, Bulgaria
