Old Techniques in Differentially Private Linear Regression

Research output: Contribution to journal › Conference article › peer-review

Abstract

We introduce three novel differentially private algorithms that approximate the 2nd-moment matrix of the data. These algorithms, which in contrast to existing algorithms always output positive-definite matrices, correspond to existing techniques in the linear regression literature. These techniques therefore have an immediate interpretation, and all results known about them apply straightforwardly to the outputs of our algorithms. More specifically, we discuss the following three techniques. (i) For ridge regression, we propose setting the regularization coefficient so that approximating the solution via the Johnson-Lindenstrauss transform preserves privacy. (ii) We show that adding a batch of d + O(∊²) random samples to our data preserves differential privacy. (iii) We show that sampling the 2nd-moment matrix from a Bayesian posterior inverse-Wishart distribution is differentially private. We also give utility bounds for our algorithms and compare them with the existing “Analyze Gauss” algorithm of Dwork et al. (2014).
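As an illustrative sketch of technique (iii), the flow of sampling the 2nd-moment matrix from an inverse-Wishart posterior can be mimicked in NumPy as follows. This is not the paper's calibrated mechanism: the prior parameters `nu0` and `Psi0` below are placeholders, and the paper's privacy analysis dictates how they must actually be set to obtain differential privacy.

```python
import numpy as np

def sample_inverse_wishart(psi, df, rng):
    """Draw from inverse-Wishart(psi, df) using only NumPy:
    sample W ~ Wishart(psi^{-1}, df) via Gaussian rows, then invert."""
    d = psi.shape[0]
    # Cholesky factor of psi^{-1}, used to generate N(0, psi^{-1}) rows.
    L = np.linalg.cholesky(np.linalg.inv(psi))
    Z = rng.normal(size=(df, d)) @ L.T   # df i.i.d. rows ~ N(0, psi^{-1})
    W = Z.T @ Z                          # W ~ Wishart(psi^{-1}, df)
    return np.linalg.inv(W)              # inverse-Wishart(psi, df) draw

rng = np.random.default_rng(0)

# Toy data: n rows in d dimensions, clipped to unit norm (bounded rows
# are the usual assumption in this setting).
n, d = 200, 3
X = rng.normal(size=(n, d))
X /= np.maximum(1.0, np.linalg.norm(X, axis=1, keepdims=True))

A = X.T @ X  # exact 2nd-moment matrix (the sensitive quantity)

# Hypothetical prior: nu0 degrees of freedom, identity scale Psi0.
# These values are placeholders; the paper's analysis fixes them.
nu0, Psi0 = d + 10, np.eye(d)

# Conjugate update: the posterior is inverse-Wishart with scale Psi0 + A
# and df nu0 + n; a single draw serves as the private surrogate for A
# and is symmetric positive-definite by construction.
A_priv = sample_inverse_wishart(Psi0 + A, nu0 + n, rng)
```

A draw like `A_priv` can then be plugged into any downstream regression routine that expects a positive-definite 2nd-moment matrix, which is the interpretability point the abstract emphasizes.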

Original language: English
Pages (from-to): 789-827
Number of pages: 39
Journal: Proceedings of Machine Learning Research
Volume: 98
State: Published - 2019
Externally published: Yes
Event: 30th International Conference on Algorithmic Learning Theory, ALT 2019 - Chicago, United States
Duration: 22 Mar 2019 to 24 Mar 2019
https://proceedings.mlr.press/v98

Keywords

  • Differential Privacy
  • Linear Regression
  • Second-Moment Matrix
  • Wishart Distribution

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
