Session

Privacy-Preserving Framework for Collaborative Machine Learning on Sensitive Data

Deep learning algorithms are data-hungry: the more data they have, the better they generalize to unseen data. While considerable effort has gone into gathering and publishing huge datasets of non-sensitive data to achieve this effect, the same cannot be done for sensitive data for obvious reasons. This has limited the development of large, robust models in areas with sensitive data, such as finance and healthcare, to large organizations that already hold lots of data.

The alternative, sharing data across healthcare and financial practitioners, could help in developing capable models because of the variety of rich data they collectively possess; however, it raises data security and privacy concerns.

In my session I intend to showcase a framework for sharing sensitive information across organizations to collaboratively train a single deep learning model in a privacy-preserving way, using autoencoding and differential privacy.
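As a rough sketch of the general idea (not the speaker's actual framework), each organization could encode its records with an autoencoder and add calibrated Gaussian noise to the latent vectors before sharing them for joint training. The architecture, latent size, clipping norm, and noise scale below are illustrative assumptions only.

```python
# Minimal sketch: autoencode sensitive records, then privatize the latent
# codes with the Gaussian mechanism before sharing across organizations.
import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    def __init__(self, n_features: int, latent_dim: int = 8):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_features, 32), nn.ReLU(),
            nn.Linear(32, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32), nn.ReLU(),
            nn.Linear(32, n_features),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def privatize(encoder: nn.Module, x: torch.Tensor,
              clip_norm: float = 1.0, noise_std: float = 0.5) -> torch.Tensor:
    """Encode records, clip each latent vector's L2 norm to bound
    sensitivity, and add Gaussian noise. In a real deployment,
    noise_std must be calibrated to the desired (epsilon, delta)."""
    with torch.no_grad():
        z = encoder(x)
        norms = z.norm(dim=1, keepdim=True).clamp(min=1e-12)
        z = z * torch.clamp(clip_norm / norms, max=1.0)  # norm clipping
        return z + noise_std * torch.randn_like(z)       # Gaussian noise

# Each organization shares only the noisy latent codes, which can then be
# pooled to train a downstream model collaboratively.
```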

Brackly Murunga

Portfolio Data Scientist @ M-KOPA

Nairobi, Kenya


