Tags: Artificial Intelligence, DevOps, Security, Azure DevOps, AI & Machine Learning
A results-driven software engineer with expertise in community building. I teach Microsoft technologies, especially cloud solutions based on Microsoft Azure and Azure Stack.
During this workshop, participants will learn how to analyze video content using Azure Cognitive Services. This is made possible by the Azure Video Indexer API, which transcribes videos uploaded to Microsoft Azure.
Combining the power of Video Indexer and Microsoft Azure, attendees will be walked through using the Video Indexer API to analyze, translate, and transcribe videos, and to generate video and image metadata for precise indexing on search engines.
The tool transcribes and categorizes videos; participants will then use the Video Indexer API to retrieve their videos from a web app.
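As a rough sketch of the flow above, the two core Video Indexer calls are an upload-and-index request and an insights request. The URL shapes below follow the documented Video Indexer REST API, but the location, account ID, token, and video names are all placeholders:

```python
API_ROOT = "https://api.videoindexer.ai"

def upload_url(location, account_id, access_token, video_name, video_url):
    # Build the upload-and-index request URL; Video Indexer fetches the
    # video from video_url and starts transcription and analysis.
    return (f"{API_ROOT}/{location}/Accounts/{account_id}/Videos"
            f"?accessToken={access_token}&name={video_name}&videoUrl={video_url}")

def get_index_url(location, account_id, video_id, access_token):
    # Build the URL that returns the video's insights
    # (transcript, faces, topics, keywords) once indexing completes.
    return (f"{API_ROOT}/{location}/Accounts/{account_id}/Videos/{video_id}/Index"
            f"?accessToken={access_token}")

# Usage with the requests library (network calls, not executed here):
# resp = requests.post(upload_url("trial", ACCOUNT_ID, TOKEN, "demo", VIDEO_URL))
# video_id = resp.json()["id"]
# insights = requests.get(get_index_url("trial", ACCOUNT_ID, video_id, TOKEN)).json()
```

The web app can poll the index URL until processing finishes, then read the transcript and category data out of the returned JSON.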
During this session, the audience will be walked through building an AI-powered web app with Python and Flask, running on Microsoft Azure. The web app analyzes faces with AI, stores information about the emotion shown on each face in an image, and notifies the user if it detects sad faces multiple times. It also serves a simple HTML page showing the captured data.
Once a picture has been successfully taken, a Web API built in Python receives the picture and analyzes it with the Azure Face API from Azure Cognitive Services, an AI service that can recognize faces in images and estimate attributes such as a person's age and whether they are smiling.
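The Face API step can be sketched as a single `detect` call that asks for face attributes. The URL path, header, and parameter names below follow the published Face API detect operation; the endpoint and key are placeholders for your own Cognitive Services resource:

```python
def detect_request(endpoint, key):
    # Assemble the pieces of a Face API 'detect' call that requests
    # age, smile, and emotion attributes for every face in the image.
    url = f"{endpoint}/face/v1.0/detect"
    headers = {
        "Ocp-Apim-Subscription-Key": key,          # your resource key
        "Content-Type": "application/octet-stream", # raw image bytes
    }
    params = {"returnFaceAttributes": "age,smile,emotion"}
    return url, headers, params

# Usage with the requests library (network call, not executed here):
# url, headers, params = detect_request(ENDPOINT, KEY)
# faces = requests.post(url, headers=headers, params=params, data=image_bytes).json()
```

The response is a JSON list with one entry per detected face, each carrying the requested attributes.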
The Web API uses this cognitive service to detect the emotion of every face. The results are then saved to Azure Cosmos DB, a document database whose documents contain key/value pairs of data stored in JSON format. The API also returns a count of emotions, which the Python app uses to ask whether the user is "okay" once the number of sad faces reaches 3 or more.
The app basically selects an emotion, and the user tries their best to show that emotion on their face. Once they have their best emotion face on, they take a picture with a camera, and the web game uses the Azure Face API to check which emotion they are showing. If it matches the one they were asked to show, the app returns a success; otherwise, a failure.
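The game's success check reduces to comparing the requested emotion with the face's dominant one. This is a hypothetical helper under the assumption that the app also wants a minimum confidence before declaring a win:

```python
def check_emotion_match(target, emotions, min_score=0.5):
    # Succeed when the requested emotion is the face's dominant emotion
    # and is shown with at least min_score confidence (0.0-1.0 scale).
    dominant = max(emotions, key=emotions.get)
    return dominant == target and emotions[target] >= min_score
```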
This is a simple illustration of how this technology could be used in a self-care app.
The web-based software runs Python Flask under the hood, including a Flask API that uses an Azure Cognitive Services package to detect the emotion of every face. The emotion information is returned as a set of properties, each with a value for how likely it is that that particular emotion is truly being shown, on a scale of 0.0 to 1.0 (0.0 being not likely and 1.0 being very likely).
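To make the 0.0-1.0 scale concrete, here is an illustrative per-face emotion payload (the eight emotion keys match the Face API's documented emotion attributes; the values are made up) and how the app might rank them:

```python
# Illustrative shape of one face's emotion scores; each value is a
# likelihood on a 0.0-1.0 scale, 0.0 = not likely, 1.0 = very likely.
scores = {
    "anger": 0.0, "contempt": 0.01, "disgust": 0.0, "fear": 0.0,
    "happiness": 0.05, "neutral": 0.2, "sadness": 0.72, "surprise": 0.02,
}

# Rank emotions from most to least likely for display in the HTML page.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```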