Private Mean Estimation of Heavy-Tailed Distributions
Annual Conference Computational Learning Theory (COLT), 2020
Abstract
We give new upper and lower bounds on the minimax sample complexity of differentially private mean estimation of distributions with bounded k-th moments. Roughly speaking, in the univariate case, we show that n = Θ(1/α² + 1/(α^{k/(k−1)} ε)) samples are necessary and sufficient to estimate the mean to α-accuracy under ε-differential privacy, or any of its common relaxations. This result demonstrates a qualitatively different behavior compared to estimation absent privacy constraints, for which the sample complexity is identical for all k ≥ 2. We also give algorithms for the multivariate setting whose sample complexity is a factor of O(d) larger than the univariate case.
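As a rough illustration of the stated univariate rate, the sketch below (my own, not from the paper; constants are suppressed, and the function name is hypothetical) evaluates the two terms of the bound n = Θ(1/α² + 1/(α^{k/(k−1)} ε)), showing how the privacy term shrinks as the moment bound k grows while the statistical term stays fixed.

```python
def sample_complexity(alpha: float, eps: float, k: int) -> float:
    """Order-of-magnitude sample complexity for private mean estimation
    of a distribution with bounded k-th moments (constants suppressed).

    alpha: target accuracy, eps: privacy parameter, k >= 2: moment bound.
    """
    statistical_term = 1.0 / alpha**2                  # cost absent privacy, same for all k
    privacy_term = 1.0 / (alpha ** (k / (k - 1)) * eps)  # extra cost of eps-DP
    return statistical_term + privacy_term

# With alpha = 0.1 and eps = 1: for k = 2 the privacy term is 1/(alpha^2 * eps),
# matching the statistical term; for larger k the exponent k/(k-1) falls toward 1,
# so the privacy term decays toward 1/(alpha * eps).
for k in (2, 3, 10):
    print(k, sample_complexity(0.1, 1.0, k))
```

This makes the abstract's qualitative point concrete: without privacy the sample complexity is 1/α² for every k ≥ 2, whereas under ε-DP the additional term depends on k and interpolates between 1/(α² ε) at k = 2 and 1/(α ε) as k → ∞.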
