EVENT TALKS

Building Differentially Private Machine Learning Models Using TensorFlow Privacy

Chang Liu | TMLS2019

TDS Editors
Towards Data Science
2 min read · Mar 24, 2020


A talk from the Toronto Machine Learning Summit: https://torontomachinelearning.com/

About the speaker

Chang Liu is an Applied Research Scientist and a member of the Georgian Impact team. She brings her in-depth knowledge of mathematical and combinatorial optimization to helping Georgian’s portfolio companies. Chang holds a Master of Applied Science in Operations Research from the University of Toronto, where she specialized in combinatorial optimization. She also holds a bachelor’s degree in mathematics from the University of Waterloo.

About the talk

This talk will introduce differential privacy and its use cases, discuss the new component of the TensorFlow Privacy library, and offer real-world scenarios for applying these tools. In recent years, the world has become increasingly data-driven, and individuals and organizations have developed a stronger awareness of, and concern for, the privacy of their sensitive data.

It has been shown that it is impossible to release accurate statistical results about a private database without revealing at least some information about the individuals in it. In fact, an entire database can sometimes be reconstructed from just a few query results.
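The summary doesn’t show what such a recovery looks like, but a minimal sketch of the idea, using hypothetical salary data rather than an example from the talk, is the classic differencing attack: two seemingly harmless aggregate queries pin down a single person’s record.

# Hypothetical example of a differencing attack: two aggregate queries
# over a private table are enough to recover one person's exact value.
salaries = {"alice": 82_000, "bob": 95_000, "carol": 61_000}

# Query 1: sum of salaries for everyone.
total_all = sum(salaries.values())

# Query 2: sum of salaries for everyone except Alice.
total_without_alice = sum(v for k, v in salaries.items() if k != "alice")

# The difference reveals Alice's salary exactly, even though each query
# on its own looks like a harmless aggregate statistic.
print(total_all - total_without_alice)  # 82000

Differential privacy counters exactly this kind of attack by adding calibrated random noise to released results, so the difference between two such answers no longer pins down any one individual’s value.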

Following research on the privacy of sensitive databases, a number of big players such as Google, Apple, and Uber have turned to differential privacy to help guarantee the privacy of sensitive data. That attention from major technology firms has helped bring differential privacy out of research labs and into the realm of software engineering and product development.
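The summary doesn’t define the guarantee itself, but for reference, the standard (ε, δ) formulation, which is also what TensorFlow Privacy’s accounting tools report, asks that a randomized algorithm M behave almost identically on any two databases D and D′ that differ in a single record: for every set of possible outputs S,

\Pr[M(D) \in S] \le e^{\varepsilon} \cdot \Pr[M(D') \in S] + \delta

Smaller values of ε and δ mean that the presence or absence of any one individual can shift the output distribution only slightly.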

Smaller firms and software startups are now also adopting differential privacy and finding great value in it. Beyond the privacy guarantees themselves, advances in differential privacy allow businesses to unlock new capabilities and get more utility out of their data.

One such capability is the ability to transfer knowledge from existing data through differentially private ensemble models without raising data privacy concerns. As differential privacy gains recognition at large tech companies, efforts are underway to make current state-of-the-art research more accessible to the general public and to small startups.

As a contribution to the broader community, Georgian Partners has provided its differential privacy library to the TensorFlow community. Together, the two teams are making differentially private stochastic gradient descent available through a user-friendly, easy-to-use API that lets users train private logistic regression models.
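For readers curious what that API looks like in practice, the following is a minimal sketch, not code shown in the talk, of training a differentially private logistic regression model with TensorFlow Privacy’s DP-SGD Keras optimizer; the dataset, model size, and hyperparameter values are illustrative assumptions.

# A minimal sketch of DP-SGD with TensorFlow Privacy (pip install tensorflow-privacy).
# The data, model size, and hyperparameters below are illustrative assumptions,
# not values from the talk.
import numpy as np
import tensorflow as tf
import tensorflow_privacy

# Toy, randomly generated binary-classification data standing in for a
# sensitive dataset.
x_train = np.random.normal(size=(1000, 20)).astype(np.float32)
y_train = np.random.randint(0, 2, size=(1000, 1)).astype(np.float32)

batch_size = 250  # must be divisible by num_microbatches

# Logistic regression: a single dense unit with a sigmoid output.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(1, activation="sigmoid", input_shape=(20,)),
])

# DP-SGD: per-microbatch gradients are clipped to l2_norm_clip, and Gaussian
# noise scaled by noise_multiplier is added before each parameter update.
optimizer = tensorflow_privacy.DPKerasSGDOptimizer(
    l2_norm_clip=1.0,
    noise_multiplier=1.1,
    num_microbatches=batch_size,
    learning_rate=0.15,
)

# The loss must be left unreduced (per-example) so the optimizer can clip
# and noise gradients at the microbatch level.
loss = tf.keras.losses.BinaryCrossentropy(
    reduction=tf.keras.losses.Reduction.NONE)

model.compile(optimizer=optimizer, loss=loss, metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, batch_size=batch_size)

Under settings like these, the library’s accompanying privacy-accounting tools (for example, its compute_dp_sgd_privacy utility) can translate the noise multiplier, sampling rate, and number of epochs into a concrete (ε, δ) guarantee for the trained model.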
