
Is Differential Privacy a Silver Bullet for Machine Learning?

March 16, 2022 @ 3:00 pm - 4:00 pm

Some machine learning applications involve training data that is sensitive, such as the medical histories of patients in a clinical trial. A model may inadvertently and implicitly store some of its training data; careful analysis of the model may therefore reveal sensitive information. To address this problem, algorithms for private machine learning have been proposed. In this talk, we first show that training neural networks with rigorous privacy guarantees like differential privacy requires rethinking their architectures with the goals of privacy-preserving gradient descent in mind. Second, we explore how private aggregation surfaces the synergies between privacy and generalization in machine learning. Third, we present recent work towards a form of collaborative machine learning that is both privacy-preserving in the sense of differential privacy, and confidentiality-preserving in the sense of the cryptographic community. We motivate the need for this new approach by showing how existing paradigms like federated learning fail to preserve privacy in these settings.
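The "rigorous privacy guarantees like differential privacy" mentioned above are typically obtained with DP-SGD-style training: clip each example's gradient to bound its influence, then add Gaussian noise before averaging. A minimal NumPy sketch of that step is below; the function name `dp_sgd_step` and the parameter values (`clip_norm`, `noise_multiplier`) are illustrative assumptions, not the speaker's implementation.

```python
import numpy as np

def dp_sgd_step(per_example_grads, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """One differentially private gradient step (DP-SGD style, illustrative):
    clip each per-example gradient to clip_norm, sum, add Gaussian noise,
    and return the noisy average."""
    rng = np.random.default_rng(0) if rng is None else rng
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale the gradient down only if its norm exceeds clip_norm.
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    # Noise scale is proportional to the clipping bound (the sensitivity).
    noise = rng.normal(0.0, noise_multiplier * clip_norm,
                       size=per_example_grads[0].shape)
    return (np.sum(clipped, axis=0) + noise) / len(per_example_grads)
```

The clipping bound caps each example's contribution (the sensitivity), which is what lets the added noise translate into a formal differential-privacy guarantee; architectures that produce well-behaved per-example gradients therefore pay a smaller accuracy cost, which is one reason the talk argues for rethinking architectures with private training in mind.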

Zoom meeting link: https://newcastleuniversity.zoom.us/j/83414117847?pwd=d01RbWdsWWVLc2pZSHpFZjVBS1IzQT09
Meeting ID: 834 1411 7847
Passcode: 740564

YouTube live stream: https://youtu.be/cy0KcDCHX34


Presenter

Nicolas Papernot (University of Toronto)

Nicolas Papernot is an Assistant Professor in the Department of Electrical and Computer Engineering and the Department of Computer Science at the University of Toronto. He is also a faculty member at the Vector Institute where he holds a Canada CIFAR AI Chair, and a faculty affiliate at the Schwartz Reisman Institute. His research interests span the security and privacy of machine learning. Nicolas is a Connaught Researcher and was previously a Google PhD Fellow. His work on differentially private machine learning received a best paper award at ICLR 2017. He is an associate chair of IEEE S&P (Oakland) and an area chair of NeurIPS. He earned his Ph.D. at the Pennsylvania State University, working with Prof. Patrick McDaniel. Upon graduating, he spent a year as a research scientist at Google Brain where he still spends some of his time.
