Fast, Sample-Efficient, Affine-Invariant Private Mean and Covariance Estimation for Subgaussian Distributions

Abstract

We present a fast, differentially private algorithm for high-dimensional covariance-aware mean estimation with nearly optimal sample complexity. Only exponential-time estimators were previously known to achieve this guarantee. Given $n$ samples from a (sub-)Gaussian distribution with unknown mean $\mu$ and covariance $\Sigma$, our $(\varepsilon,\delta)$-differentially private estimator produces $\tilde{\mu}$ such that $\|\mu - \tilde{\mu}\|_{\Sigma} \leq \alpha$ as long as $n \gtrsim \tfrac{d}{\alpha^2} + \tfrac{d\sqrt{\log 1/\delta}}{\alpha\varepsilon} + \tfrac{d\log 1/\delta}{\varepsilon}$. The Mahalanobis error metric $\|\mu - \hat{\mu}\|_{\Sigma}$ measures the distance between $\hat{\mu}$ and $\mu$ relative to $\Sigma$; it characterizes the error of the sample mean. Our algorithm runs in time $\tilde{O}(nd^{\omega - 1} + nd/\varepsilon)$, where $\omega < 2.38$ is the matrix multiplication exponent. We adapt an exponential-time approach of Brown, Gaboardi, Smith, Ullman, and Zakynthinou (2021), giving efficient variants of stable mean and covariance estimation subroutines that also improve the sample complexity to the nearly optimal bound above. Our stable covariance estimator can be turned into a private covariance estimator for unrestricted subgaussian distributions. With $n \gtrsim d^{3/2}$ samples, our estimate is accurate in spectral norm. This is the first such algorithm using $n = o(d^2)$ samples, answering an open question posed by Alabi et al. (2022). With $n \gtrsim d^2$ samples, our estimate is accurate in Frobenius norm. This leads to a fast, nearly optimal algorithm for privately learning unrestricted Gaussian distributions in TV distance. Duchi, Haque, and Kuditipudi (2023) obtained similar results independently and concurrently.
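As a point of reference for the error metric, here is a minimal Python sketch (not the paper's estimator) of the Mahalanobis norm $\|\mu - \hat{\mu}\|_{\Sigma} = \sqrt{(\mu - \hat{\mu})^{\top} \Sigma^{-1} (\mu - \hat{\mu})}$, together with a toy check that the non-private sample mean achieves error roughly $\sqrt{d/n}$ in this metric, matching the $d/\alpha^2$ term in the bound above; the covariance used is a hypothetical example.

```python
import numpy as np

def mahalanobis_error(mu, mu_hat, Sigma):
    """Mahalanobis error ||mu - mu_hat||_Sigma = sqrt((mu - mu_hat)^T Sigma^{-1} (mu - mu_hat)).

    Measures the estimation error relative to the covariance Sigma, which makes
    the guarantee invariant under affine transformations of the data.
    """
    diff = mu - mu_hat
    # Solve Sigma x = diff rather than forming Sigma^{-1} explicitly.
    return float(np.sqrt(diff @ np.linalg.solve(Sigma, diff)))

# Toy illustration (assumed setup, not from the paper): the error of the
# sample mean concentrates around sqrt(d/n), since Sigma^{-1/2}(mu_hat - mu)
# is distributed as N(0, I/n).
rng = np.random.default_rng(0)
d, n = 10, 10_000
Sigma = np.diag(rng.uniform(0.5, 2.0, size=d))  # hypothetical covariance
mu = rng.normal(size=d)
samples = rng.multivariate_normal(mu, Sigma, size=n)
print(mahalanobis_error(mu, samples.mean(axis=0), Sigma))  # ~ sqrt(d/n) ≈ 0.03
```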
