
Private Mean Estimation with Person-Level Differential Privacy

Main: 1 page · 3 figures · Appendix: 71 pages
Abstract

We study person-level differentially private (DP) mean estimation in the case where each person holds multiple samples. DP here requires the usual notion of distributional stability when \textit{all} of a person's datapoints can be modified. Informally, if $n$ people each have $m$ samples from an unknown $d$-dimensional distribution with bounded $k$-th moments, we show that \[n = \tilde \Theta\left(\frac{d}{\alpha^2 m} + \frac{d}{\alpha m^{1/2} \varepsilon} + \frac{d}{\alpha^{k/(k-1)} m \varepsilon} + \frac{d}{\varepsilon}\right)\] people are necessary and sufficient to estimate the mean up to distance $\alpha$ in $\ell_2$-norm under $\varepsilon$-differential privacy (and its common relaxations). In the multivariate setting, we give computationally efficient algorithms under approximate DP and computationally inefficient algorithms under pure DP, and our nearly matching lower bounds hold for the most permissive case of approximate DP. Our computationally efficient estimators are based on the standard clip-and-noise framework, but the analysis for our setting requires both new algorithmic techniques and new analyses. In particular, our new bounds on the tails of sums of independent, vector-valued, bounded-moments random variables may be of interest.
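To make the clip-and-noise framework concrete, the following is a minimal sketch of a person-level estimator: average each person's $m$ samples, clip the per-person averages to an $\ell_2$-ball to bound any one person's influence, then release the noisy mean via the Gaussian mechanism. This is only an illustration of the generic framework under assumed parameter choices, not the authors' tuned algorithm or analysis.

```python
import numpy as np

def clip_and_noise_mean(samples, clip_radius, epsilon, delta):
    """Sketch of person-level clip-and-noise mean estimation.

    samples: array of shape (n, m, d) -- n people, m samples each, in R^d.
    clip_radius, epsilon, delta: illustrative parameters (not the paper's
    tuned settings).
    """
    person_means = samples.mean(axis=1)  # shape (n, d): one vector per person
    norms = np.linalg.norm(person_means, axis=1, keepdims=True)
    # Clip each person's mean into the l2-ball of radius clip_radius, so that
    # changing ALL of one person's datapoints (person-level DP) moves the sum
    # by at most 2 * clip_radius in l2.
    clipped = person_means * np.minimum(1.0, clip_radius / np.maximum(norms, 1e-12))
    n, d = clipped.shape
    sensitivity = 2.0 * clip_radius / n
    # Standard Gaussian-mechanism noise scale for (epsilon, delta)-DP.
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return clipped.mean(axis=0) + np.random.normal(0.0, sigma, size=d)
```

Note the person-level twist: averaging within each person first shrinks the per-person variance by a factor of $m$, which is what lets the clip radius (and hence the noise) decrease as $m$ grows.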
