Can Stochastic Gradient Langevin Dynamics Provide Differential Privacy for Deep Learning?

Guy Heller, Ethan Fetaya

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Bayesian learning via Stochastic Gradient Langevin Dynamics (SGLD) has been suggested for differentially private learning. While previous research provides differential privacy bounds for SGLD at the initial steps of the algorithm or when close to convergence, the question of what differential privacy guarantees can be made in between remains unanswered. This interim region is of great importance, especially for Bayesian neural networks, as it is hard to guarantee convergence to the posterior. This paper shows that using SGLD might result in unbounded privacy loss for this interim region, even when sampling from the posterior is as differentially private as desired.
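For context, the SGLD update rule the paper analyzes injects Gaussian noise into a minibatch gradient step on the log-posterior. The sketch below illustrates one possible implementation on a toy Gaussian mean-estimation problem; the model, prior, and all hyperparameters are hypothetical choices for illustration, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: N observations from N(true_mean, 1); we infer the mean.
N, true_mean = 1000, 2.0
data = rng.normal(true_mean, 1.0, size=N)

def sgld_step(theta, minibatch, step_size, n_total, rng):
    """One SGLD update: the log-posterior gradient is estimated from a
    minibatch (rescaled by n_total / batch size), and Gaussian noise
    with variance equal to the step size is added."""
    # Gradient of log-prior N(0, 10^2): -theta / 100
    grad_prior = -theta / 100.0
    # Gradient of log-likelihood N(x | theta, 1), rescaled to the full dataset
    grad_lik = (n_total / len(minibatch)) * np.sum(minibatch - theta)
    noise = rng.normal(0.0, np.sqrt(step_size))
    return theta + 0.5 * step_size * (grad_prior + grad_lik) + noise

theta = 0.0
for t in range(2000):
    batch = rng.choice(data, size=32, replace=False)
    theta = sgld_step(theta, batch, step_size=1e-4, n_total=N, rng=rng)

print(theta)  # iterates should concentrate near the data mean (~2.0)
```

The per-step Gaussian noise is what motivates viewing SGLD through a differential-privacy lens: each update resembles a noisy gradient release. The paper's point is that this resemblance does not by itself yield bounded privacy loss in the interim regime between initialization and convergence.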

Original language: English
Title of host publication: Proceedings - 2023 IEEE Conference on Secure and Trustworthy Machine Learning, SaTML 2023
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 68-106
Number of pages: 39
ISBN (Electronic): 9781665462990
DOIs
State: Published - 2023
Event: 2023 IEEE Conference on Secure and Trustworthy Machine Learning, SaTML 2023 - Raleigh, United States
Duration: 8 Feb 2023 – 10 Feb 2023

Publication series

Name: Proceedings - 2023 IEEE Conference on Secure and Trustworthy Machine Learning, SaTML 2023

Conference

Conference: 2023 IEEE Conference on Secure and Trustworthy Machine Learning, SaTML 2023
Country/Territory: United States
City: Raleigh
Period: 8/02/23 – 10/02/23

Keywords

  • Bayesian Inference
  • Deep Learning
  • Differential Privacy
  • Stochastic Gradient Langevin Dynamics

All Science Journal Classification (ASJC) codes

  • Computer Vision and Pattern Recognition
  • Safety, Risk, Reliability and Quality
  • Artificial Intelligence
