Session

Protect your financial ML.NET workloads with Confidential Computing

Preserving privacy when processing data from multiple sources with machine learning is always a challenge. Organizations may want to perform collaborative data analytics while guaranteeing the privacy of their individual datasets. Combining multiple data sources improves prediction accuracy, but it can come at the cost of confidentiality if sensitive information is not adequately protected.
Azure Confidential Computing adds new data security capabilities to the cloud, and specifically to machine learning workloads. By using trusted execution environments (TEEs) to protect data while in use, confidential computing lets different organizations train machine learning models on their combined data without revealing the data being processed.
This session presents the benefits of confidential computing in a machine learning solution where different financial institutions share their confidential datasets for data analysis and credit risk prediction with the ML.NET library, while still masking sensitive information to protect the privacy of their customers and prevent any potential data leakage.
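For reference, a minimal ML.NET sketch of the kind of credit risk classifier described above, assuming an illustrative applicant schema (Income, DebtRatio, CreditHistoryYears are hypothetical fields) and a local CSV file; in a real confidential computing deployment the training would run inside the TEE:

```csharp
using System;
using Microsoft.ML;
using Microsoft.ML.Data;

// Hypothetical applicant schema; field names are illustrative only.
public class CreditApplicant
{
    [LoadColumn(0)] public float Income;
    [LoadColumn(1)] public float DebtRatio;
    [LoadColumn(2)] public float CreditHistoryYears;
    [LoadColumn(3), ColumnName("Label")] public bool Defaulted;
}

public class CreditRiskPrediction
{
    [ColumnName("PredictedLabel")] public bool WillDefault;
    public float Probability;
}

public static class Program
{
    public static void Main()
    {
        var mlContext = new MLContext(seed: 0);

        // Load the pooled dataset. In a confidential computing deployment the
        // file would be decrypted only inside the TEE's protected memory.
        IDataView data = mlContext.Data.LoadFromTextFile<CreditApplicant>(
            "credit-data.csv", hasHeader: true, separatorChar: ',');

        // Assemble the feature vector and train a logistic regression classifier.
        var pipeline = mlContext.Transforms.Concatenate("Features",
                nameof(CreditApplicant.Income),
                nameof(CreditApplicant.DebtRatio),
                nameof(CreditApplicant.CreditHistoryYears))
            .Append(mlContext.BinaryClassification.Trainers.SdcaLogisticRegression(
                labelColumnName: "Label", featureColumnName: "Features"));

        ITransformer model = pipeline.Fit(data);

        // Score a single applicant.
        var engine = mlContext.Model
            .CreatePredictionEngine<CreditApplicant, CreditRiskPrediction>(model);
        var result = engine.Predict(new CreditApplicant
        {
            Income = 52000f, DebtRatio = 0.35f, CreditHistoryYears = 7f
        });
        Console.WriteLine($"Predicted default probability: {result.Probability:P1}");
    }
}
```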

Stefano Tempesta

Web3 Architect & CTO | AI & Blockchain for Good Ambassador

Gold Coast, Australia
