DIFFERENTIALLY PRIVATE ORDINARY LEAST SQUARES

Research output: Contribution to journal › Article › peer-review

Abstract

Linear regression is one of the most prevalent techniques in machine learning; however, it is also common to use linear regression for its explanatory capabilities rather than for label prediction. Ordinary Least Squares (OLS) is often used in statistics to establish a correlation between an attribute (e.g. gender) and a label (e.g. income) in the presence of other (potentially correlated) features. OLS assumes a particular model that randomly generates the data and derives t-values, which represent the likelihood of each real value being the true correlation. Using t-values, OLS can release a confidence interval: an interval on the reals that is likely to contain the true correlation, and when this interval does not intersect the origin, we can reject the null hypothesis, since the true correlation is then likely non-zero. Our work aims to achieve similar guarantees on data under differentially private estimators. First, we show that for well-spread data, the Gaussian Johnson-Lindenstrauss Transform (JLT) gives a very good approximation of t-values; secondly, when the JLT approximates Ridge regression (linear regression with ℓ2-regularization), we derive, under certain conditions, confidence intervals using the projected data; lastly, we derive, under different conditions, confidence intervals for the "Analyze Gauss" algorithm [14].
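
As a rough illustration only (not the paper's private estimators or its privacy analysis), the sketch below computes classical OLS t-values in closed form and then recomputes them after projecting the data with a Gaussian JLT; the synthetic data, the projected dimension r, and the function name ols_t_values are assumptions made for this example.

    import numpy as np

    def ols_t_values(X, y):
        # Closed-form OLS estimate and per-coordinate t-values.
        n, d = X.shape
        XtX_inv = np.linalg.inv(X.T @ X)
        beta = XtX_inv @ X.T @ y
        resid = y - X @ beta
        sigma2 = resid @ resid / (n - d)          # noise-variance estimate
        se = np.sqrt(sigma2 * np.diag(XtX_inv))   # standard errors per coordinate
        return beta, beta / se

    rng = np.random.default_rng(0)
    n, d, r = 5_000, 5, 1_000                     # r: projected dimension (assumed)
    X = rng.normal(size=(n, d))
    y = X @ np.array([1.0, 0.5, 0.0, 0.0, -0.3]) + rng.normal(size=n)

    # Gaussian JLT: project X and y with the same random matrix R.
    R = rng.normal(size=(r, n)) / np.sqrt(r)
    _, t_orig = ols_t_values(X, y)
    _, t_proj = ols_t_values(R @ X, R @ y)
    print("t-values (original): ", np.round(t_orig, 2))
    print("t-values (projected):", np.round(t_proj, 2))

For well-spread data of this kind, the t-values computed on the projected data stay close to the originals, which is the behaviour the abstract describes.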

Original language: English
Journal: Journal of Privacy and Confidentiality
Volume: 9
Issue number: 1 Special Issue
DOIs:
State: Published - 31 Mar 2019
Externally published: Yes

Keywords

  • Differential Privacy
  • Ordinary Least Squares
  • p-Value
  • t-Value

All Science Journal Classification (ASJC) codes

  • Computer Science (miscellaneous)
  • Statistics and Probability
  • Computer Science Applications
