Speaker

Anurag Singh

Director IT Analytics and Gen AI

Bengaluru, India

A visionary leader in Gen AI/OpenAI, data science/machine learning and the big data space. Technical architect pursuing Reinforcement Learning at IISc Bangalore; Design Thinking Certificate from IIT Madras (2019); Executive Certificate in Big Data Analytics from SP JIMR, Mumbai (2017); Internship Certificate in Data Science using R from IIM Lucknow (2017). Areas of interest: big data analytics, machine learning, artificial intelligence/cognitive services and blockchain.
15+ years of experience in software application development. Currently a full-time Gen AI, data science and machine learning practitioner at Honeywell Technology Labs, Bangalore, India. Seven years of experience in Microsoft web technologies, comprising mostly ASP.NET front-end web technologies, distributed services and database development.
Supported clientele: Schneider Electric, Kimberly-Clark, Diageo Business Services, Anheuser-Busch InBev, Wells Fargo Centre-India Centre of Emerging Technologies Lab, Zurich Insurance (Switzerland), Qatar Airways (Doha, Qatar), JPMorgan (New York) and Tesco (UK and Ireland). Currently associated with Diageo Business Services India.
Two years of onsite experience at client Qatar Airways, handling and owning end-to-end development of software systems.
Holds a valid B1 visa expiring in 2025. Experienced in project offshoring, project setup, and developing and deploying advanced analytics globally.

Area of Expertise

  • Information & Communications Technology

Topics

  • Machine Learning and Artificial Intelligence
  • Data Science & AI
  • Azure Machine Learning
  • Microsoft Azure
  • Snowflake

Lakehouse Paradigm: The Convergence of Data Lakes and Data Warehouses

The lakehouse platform has the SQL and performance capabilities (indexing, caching and MPP processing) to make BI work rapidly on data lakes. It also provides direct file access and native support for Python, data science and AI frameworks, without forcing data through an SQL-based data warehouse. Find out how the lakehouse platform creates an opportunity for you to accelerate your data strategy.

In this session, we’ll cover:

The evolution of data management solutions
The emergence of lakehouse architecture to store all types of data
How the lakehouse enables all analytics workloads — BI, SQL analytics, data science and machine learning
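
To make the "one copy of the data, many workloads" idea concrete, here is a minimal PySpark sketch, assuming a Spark environment with the Delta Lake libraries available (for example Databricks); the table name and storage path are illustrative placeholders rather than part of the session material.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("lakehouse-demo").getOrCreate()

    # BI-style access: register the files in the lake and query them with plain SQL
    spark.sql("CREATE TABLE IF NOT EXISTS sales USING DELTA LOCATION '/mnt/lake/sales'")
    spark.sql("SELECT region, SUM(amount) AS revenue FROM sales GROUP BY region").show()

    # Data-science access: the same files as a DataFrame, no warehouse copy required
    sales_df = spark.read.format("delta").load("/mnt/lake/sales")
    pdf = sales_df.toPandas()   # hand off to pandas / scikit-learn / ML frameworks
    print(pdf.describe())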

Exploring GitHub Copilot

In this session we will see how to leverage the power of generative AI for pair programming with GitHub Copilot.
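
As a flavour of what pair programming with an AI assistant looks like, the sketch below shows a descriptive comment acting as the prompt and the kind of completion an assistant might suggest; the suggestion is illustrative, not an actual Copilot output.

    import re

    # Prompt-style comment typed in the editor:
    # "return True if the string looks like a valid e-mail address"
    def is_valid_email(address: str) -> bool:
        """Illustrative completion of the kind a coding assistant might propose."""
        pattern = r"^[\w.+-]+@[\w-]+\.[\w.-]+$"
        return re.fullmatch(pattern, address) is not None

    print(is_valid_email("speaker@example.com"))  # True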

Announcing Vector Search in Azure Cognitive Search Public Preview

In this session I will uncover the exciting possibilities of vectors in Azure Cognitive Search (preview) in tandem with Azure OpenAI Service functions. Discover how these powerful tools can enhance search capabilities, making your AI solutions smarter and more intuitive. Experience the accuracy and relevance of vector + text “hybrid search,” while tapping into the advanced natural language understanding and generation capabilities of Azure OpenAI Service. Revolutionize information retrieval and take your language-based applications to the next level within the Azure ecosystem!
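
As a taste of the hybrid search pattern, here is a minimal sketch assuming the azure-search-documents (11.4+) and openai (1.x) Python packages; the endpoint, keys, index name, vector field and embedding deployment are hypothetical placeholders, and the exact SDK surface differed during the public preview.

    from azure.core.credentials import AzureKeyCredential
    from azure.search.documents import SearchClient
    from azure.search.documents.models import VectorizedQuery
    from openai import AzureOpenAI

    aoai = AzureOpenAI(azure_endpoint="https://<aoai-resource>.openai.azure.com",
                       api_key="<aoai-key>", api_version="2024-02-01")
    search = SearchClient(endpoint="https://<search-service>.search.windows.net",
                          index_name="docs-index",
                          credential=AzureKeyCredential("<search-key>"))

    question = "How do I rotate storage account keys?"
    embedding = aoai.embeddings.create(model="text-embedding-ada-002",
                                       input=question).data[0].embedding

    # Hybrid search: keyword matching on search_text plus k-NN over the vector field
    results = search.search(search_text=question,
                            vector_queries=[VectorizedQuery(vector=embedding,
                                                            k_nearest_neighbors=3,
                                                            fields="contentVector")],
                            select=["title", "content"])
    for doc in results:
        print(doc["title"])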

How to Explain Models with InterpretML

With the recent popularity of machine learning algorithms such as neural networks and ensemble methods, machine learning models have become more like 'black boxes': harder to understand and interpret. To gain stakeholders' trust, there is a strong need for tools and methodologies that help users understand and explain how predictions are made. In this session, you will learn about the open-source machine learning interpretability toolkit InterpretML, which incorporates cutting-edge technologies developed by Microsoft and leverages proven third-party libraries. InterpretML introduces a state-of-the-art glassbox model, the Explainable Boosting Machine (EBM), and provides easy access to a variety of other glassbox models and black-box explainers.
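
A minimal EBM sketch with InterpretML is shown below; it uses a scikit-learn sample dataset purely for illustration.

    from interpret import show
    from interpret.glassbox import ExplainableBoostingClassifier
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split

    data = load_breast_cancer(as_frame=True)
    X_train, X_test, y_train, y_test = train_test_split(data.data, data.target,
                                                        random_state=0)

    ebm = ExplainableBoostingClassifier()   # glassbox model
    ebm.fit(X_train, y_train)

    # Which features drive predictions overall, and why these five rows were scored this way
    show(ebm.explain_global())
    show(ebm.explain_local(X_test.iloc[:5], y_test.iloc[:5]))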

Complement your Data Lake with Azure Databricks Delta Lake

How to make your data lake ACID compliant using the Azure Databricks Delta Lake approach. This helps prevent your Azure Data Lake from becoming a data swamp. We will also cover schema evolution in this demo-based session.
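
A minimal sketch of what the demo covers: an ACID write to the lake followed by a schema-evolving append, assuming a Databricks or Delta-enabled Spark environment; the mount path is an illustrative placeholder.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    path = "/mnt/lake/events"

    # Initial write: Delta adds a transaction log on top of the Parquet files,
    # giving atomic, isolated writes instead of half-written folders.
    v1 = spark.createDataFrame([(1, "click"), (2, "view")], ["id", "event"])
    v1.write.format("delta").mode("overwrite").save(path)

    # A later batch arrives with an extra column; mergeSchema evolves the table
    # schema instead of failing the job or silently dropping data.
    v2 = spark.createDataFrame([(3, "click", "mobile")], ["id", "event", "channel"])
    v2.write.format("delta").mode("append").option("mergeSchema", "true").save(path)

    spark.read.format("delta").load(path).printSchema()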

Machine Learning: From Proof of Concept to Production

In this session I will describe how we leverage Azure Databricks and MLflow, along with the Azure Machine Learning service, to solve some of the most complex problems the data science community faces: tracking machine learning models and datasets, versioning models, and creating an organisation-wide model repository for discovery, which in turn saves others from reinventing the wheel. We will also touch upon how to create images for machine learning models and deploy them to production using Azure Kubernetes Service or Azure Container Instances.
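
To illustrate the tracking and registry part of the story, here is a minimal MLflow sketch; the experiment and model names are hypothetical, and the deployment step is only indicated in comments.

    import mlflow
    import mlflow.sklearn
    from sklearn.datasets import load_diabetes
    from sklearn.ensemble import RandomForestRegressor

    X, y = load_diabetes(return_X_y=True)

    mlflow.set_experiment("demand-forecasting")
    with mlflow.start_run() as run:
        model = RandomForestRegressor(n_estimators=200, random_state=0)
        model.fit(X, y)
        mlflow.log_param("n_estimators", 200)             # track how the model was built
        mlflow.log_metric("train_r2", model.score(X, y))  # track how well it did
        mlflow.sklearn.log_model(model, artifact_path="model")

    # Register the tracked model so others in the organisation can discover and reuse it
    # (assumes a tracking backend with model-registry support, e.g. Databricks or Azure ML).
    mlflow.register_model(f"runs:/{run.info.run_id}/model", "demand-forecasting")
    # From here, Azure Machine Learning can package the registered model as an image
    # and deploy it to Azure Kubernetes Service or Azure Container Instances.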

Moving from ML-Centric to Data-Centric Approach

Problem Statement

Model-centric vs Data-centric/AI Bias

Need for Data-centric Platform

Essential Capabilities of Data-centric platform

Presented at the APAC AI Accelerator Festival, July 2021. A 50-minute presentation with 10 minutes of Q&A.

Building Blocks of Data Science: Data Engineering

Problem Statement
What is Data Engineering?
Quick Introduction to Data Engineering process
Data Engineering Tech Stack
Learning Path for Data Engineering
Live Q&A

A 50-minute presentation followed by 10 minutes of Q&A. Presented at the ANTWAK Master Class, Elite Techno Groups Master Class and Speakin Master Class.

Hands-on Azure Machine Learning: Supervised and Unsupervised Learning Techniques

We will cover one example each of supervised and unsupervised learning using Azure Machine Learning designer; a minimal SDK sketch for the workspace and compute steps follows the agendas below.
Create a Regression Model with Azure Machine Learning designer
Create Azure Machine Learning Workspace
Create Compute Resources
Explore Data
Create and Run Training Pipeline
Evaluate a Regression Model
Create an Inference pipeline
Deploy a Predictive Service

Create a Clustering Model with Azure Machine Learning designer
Create Azure Machine Learning Workspace
Create Compute Resources
Explore Data
Create and Run Training Pipeline
Evaluate a Clustering Model
Create an Inference pipeline
Deploy a Predictive Service
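
The designer steps above are performed in the Azure Machine Learning studio UI; the workspace and compute steps can also be scripted. Below is a minimal sketch with the v1 azureml-core SDK, where the subscription, resource group, region and names are placeholders.

    from azureml.core import Workspace
    from azureml.core.compute import AmlCompute, ComputeTarget

    # Create (or reuse) an Azure Machine Learning workspace
    ws = Workspace.create(name="ml-demo-ws",
                          subscription_id="<subscription-id>",
                          resource_group="ml-demo-rg",
                          location="southindia",
                          create_resource_group=True)

    # Provision a small CPU cluster for the designer training pipelines
    compute_config = AmlCompute.provisioning_configuration(vm_size="STANDARD_DS11_V2",
                                                           min_nodes=0, max_nodes=2)
    cluster = ComputeTarget.create(ws, "cpu-cluster", compute_config)
    cluster.wait_for_completion(show_output=True)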

Presented at the Department of Statistics, CUSAT.

Implementing Responsible AI using Error Analysis Toolkit

Error Analysis, a new Responsible AI open source toolkit, enables machine learning practitioners to identify model errors and diagnose the root causes behind these errors, helping to build responsible, reliable, and trusted solutions.

With a lot of buzz and emphasis on Responsible AI, this session will familiarise attendees with the open-source Error Analysis tool, including a demo of its functionality and how to use it in Azure.
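
A minimal sketch of the kind of demo shown in the session, assuming the raiwidgets package; exact constructor arguments vary across versions, so treat this as an outline rather than a recipe, and the dataset is just a scikit-learn sample.

    from raiwidgets import ErrorAnalysisDashboard
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    data = load_breast_cancer(as_frame=True)
    X_train, X_test, y_train, y_test = train_test_split(data.data, data.target,
                                                        random_state=0)

    model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

    # The dashboard builds an error tree and heatmap over the test data so you can
    # see which cohorts the model fails on and drill into the root causes.
    ErrorAnalysisDashboard(model=model,
                           dataset=X_test,
                           true_y=y_test,
                           features=list(data.data.columns))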

Share Data Simply and Securely using Azure Data Share

Azure Data Share offers a single pane of glass over your data-sharing relationships. In this session, you will learn how to easily provision a new data share, add datasets to it, specify your terms of use and invite recipients. We will also walk through the monitoring and governance features that ensure you always stay in control of your data.

A demo session of up to 45 minutes, covering the challenges of traditional data sharing and a demonstration of Azure Data Share.

Govern your data wherever it resides with Azure Purview

Azure Purview is a unified data governance solution that helps you manage and govern your on-premises, multicloud, and software-as-a-service (SaaS) data. Easily create a holistic, up-to-date map of your data landscape with automated data discovery, sensitive data classification and end-to-end data lineage. Enable data consumers to find valuable, trustworthy data.

A demo of setting up Purview, showing how Azure Purview scans and maps all your data, no matter where it resides.

Open Source foundation models in Azure Machine Learning

This session covers open-source foundation models in Azure Machine Learning.
Open-source foundation models are machine learning models that have been pre-trained on vast amounts of data and can be fine-tuned for specific tasks with a relatively small amount of domain-specific data. They serve as a starting point for custom models and accelerate the model-building process for a variety of tasks, including natural language processing, computer vision, speech and generative AI.

Model Catalog: Azure Machine Learning studio's hub for various pre-trained models for language, speech and vision tasks.
Model Operations: Discover, evaluate, fine-tune and deploy models with Azure's native capabilities, ensuring enterprise-grade security and data governance.
Model Collections: Access open-source models curated by Azure AI, exclusive Azure OpenAI models, and Hugging Face hub transformers for real-time inference.
Licensing & Preview: Understand third-party licenses and preview terms for models in the catalog before using them for your specific workload.
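
Since the catalog surfaces Hugging Face hub transformers, a minimal local sketch with the transformers library gives a feel for what these models do; the model name is illustrative, and the same family of models can instead be deployed from the Azure Machine Learning model catalog to a managed endpoint.

    from transformers import pipeline

    # Summarisation with a pre-trained foundation model, no task-specific training
    summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
    article = ("Foundation models are pre-trained on vast corpora and can be "
               "fine-tuned for a downstream task with a relatively small amount "
               "of domain-specific data.")
    print(summarizer(article, max_length=40, min_length=10)[0]["summary_text"])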

Architecture Patterns for Gen AI Applications

This session was presented at AZCONF 2023, Asia's biggest conference on AI and cloud. I will take you through some of the most common usage patterns we are seeing with customers for generative AI. We will explore techniques for generating text and images, creating value for organizations by improving productivity. This is achieved by leveraging foundation models to help compose emails, summarize text, answer questions, build chatbots and create images.

Architecture Patterns in:
Text Generation
Summarization
ChatBot
Question Answering
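
As a minimal sketch of the question-answering / chatbot pattern, here is a grounded prompt sent to Azure OpenAI, assuming the openai 1.x package; the endpoint, key, deployment name and context snippet are hypothetical placeholders.

    from openai import AzureOpenAI

    client = AzureOpenAI(azure_endpoint="https://<aoai-resource>.openai.azure.com",
                         api_key="<aoai-key>", api_version="2024-02-01")

    # Retrieved context (e.g. from a vector store) is placed in the prompt so the
    # model answers from the organisation's own documents rather than from memory.
    context = "Expense reports must be submitted within 30 days of travel."
    response = client.chat.completions.create(
        model="gpt-35-turbo",   # the deployment name, not the base model name
        messages=[
            {"role": "system", "content": f"Answer only from this context:\n{context}"},
            {"role": "user", "content": "How long do I have to file an expense report?"},
        ],
    )
    print(response.choices[0].message.content)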

Prompt flow: an end-to-end tool to streamline prompt engineering

Large Language Models (LLMs) have revolutionized the field of natural language processing, enabling a wide range of applications such as chatbots, summarizers, translators, and more. However, developing LLM-based apps is not an easy task. It requires a lot of trial and error, fine-tuning, testing, and deployment. Moreover, it involves working with different tools and platforms, such as LLMs, prompts, Python code, and cloud services. Introducing Prompt flow, a suite of development tools designed to streamline the end-to-end development cycle of LLM-based AI applications, from ideation, prototyping, testing, evaluation to production deployment and monitoring. Prompt flow makes prompt engineering much easier and enables you to build LLM apps with production quality.

Develop a flow
Evaluate a flow
Deploy a flow
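
A minimal local sketch with the promptflow Python SDK, covering the develop and evaluate steps; the flow folder, input names and data file are hypothetical, and the SDK surface is still evolving, so check the current documentation for exact method names.

    from promptflow import PFClient

    pf = PFClient()

    # Develop: run the flow once with a single test input and inspect the output
    result = pf.test(flow="./chat_flow", inputs={"question": "What is a lakehouse?"})
    print(result)

    # Evaluate: run the flow against a batch of examples and review the details
    run = pf.run(flow="./chat_flow", data="./eval_data.jsonl")
    print(pf.get_details(run))

    # Deploy: the same flow can then be published as a managed online endpoint
    # from Azure Machine Learning studio or the pf CLI.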
