Transfer Learning In Differential Privacy's Hybrid-Model

Refael Kohen, Or Sheffet

Research output: Contribution to journal › Conference article › peer-review

Abstract

The hybrid-model (Avent et al., 2017) in Differential Privacy is an augmentation of the local-model where, in addition to N local-agents, we are assisted by one special agent who is in fact a curator holding the sensitive details of n additional individuals. Here we study the problem of machine learning in the hybrid-model where the n individuals in the curator's dataset are drawn from a different distribution than that of the general population (the local-agents). We give a general scheme - Subsample-Test-Reweigh - for this transfer learning problem, which reduces any curator-model DP-learner to a hybrid-model learner in this setting using iterative subsampling and reweighing of the n examples held by the curator, based on a smooth variation of the Multiplicative-Weights algorithm (introduced by Bun et al. (2020)). The sample complexity of our scheme depends on the χ²-divergence between the two distributions. We give worst-case bounds on the sample complexity required for our private reduction. Aiming to reduce said sample complexity, we give two specific instances in which our sample complexity can be drastically reduced (one instance is analyzed mathematically, the other empirically) and pose several directions for follow-up work.
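The iterative subsample-learn-test-reweigh loop described in the abstract can be illustrated with a short sketch. The Python below is not the authors' code: `dp_curator_learner` (a black-box curator-model DP learner), `local_dp_error` (a locally private estimate of the hypothesis' error on the agents' distribution), and the step size `eta` are hypothetical placeholders, the update shown is a plain multiplicative-weights step rather than the smooth variant of Bun et al. (2020), and privacy accounting is omitted.

```python
# Hedged sketch of a Subsample-Test-Reweigh loop under simplifying assumptions.
import numpy as np

def subsample_test_reweigh(curator_X, curator_y, dp_curator_learner,
                           local_dp_error, rounds=10, subsample_size=100,
                           target_error=0.1, eta=0.5, rng=None):
    """Iteratively subsample the curator's n examples by weight, learn a
    hypothesis with a curator-model DP learner, test it against the local
    agents' distribution, and reweigh the examples multiplicatively."""
    rng = rng or np.random.default_rng(0)
    n = len(curator_X)
    weights = np.ones(n) / n            # uniform initial weights over curator data
    for _ in range(rounds):
        # Subsample curator examples according to the current weights.
        idx = rng.choice(n, size=min(subsample_size, n), replace=True, p=weights)
        # Run the (black-box) curator-model DP learner on the subsample.
        hypothesis = dp_curator_learner(curator_X[idx], curator_y[idx])
        # Test: estimate the error on the local agents' distribution
        # (in the paper this estimate is itself computed under local DP).
        err = local_dp_error(hypothesis)
        if err <= target_error:
            return hypothesis           # good enough on the target distribution
        # Reweigh: upweight curator examples the current hypothesis gets wrong;
        # a plain multiplicative-weights step stands in for the smooth variant.
        mistakes = (hypothesis(curator_X) != curator_y).astype(float)
        weights *= np.exp(eta * mistakes)
        weights /= weights.sum()
    return hypothesis
```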

Original language: English
Pages (from-to): 11413-11429
Number of pages: 17
Journal: Proceedings of Machine Learning Research
Volume: 162
State: Published - 2022
Event: 39th International Conference on Machine Learning, ICML 2022 - Baltimore, United States
Duration: 17 Jul 2022 - 23 Jul 2022
https://proceedings.mlr.press/v162/

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability

