arXiv:2310.06289
Better and Simpler Lower Bounds for Differentially Private Statistical Estimation

10 October 2023
Shyam Narayanan
Abstract

We provide optimal lower bounds for two well-known parameter estimation (also known as statistical estimation) tasks in high dimensions with approximate differential privacy. First, we prove that for any $\alpha \le O(1)$, estimating the covariance of a Gaussian up to spectral error $\alpha$ requires $\tilde{\Omega}\left(\frac{d^{3/2}}{\alpha \varepsilon} + \frac{d}{\alpha^2}\right)$ samples, which is tight up to logarithmic factors. This result improves over previous work, which established this for $\alpha \le O\left(\frac{1}{\sqrt{d}}\right)$, and is also simpler than previous work. Next, we prove that estimating the mean of a heavy-tailed distribution with bounded $k$th moments requires $\tilde{\Omega}\left(\frac{d}{\alpha^{k/(k-1)} \varepsilon} + \frac{d}{\alpha^2}\right)$ samples. Previous work for this problem was only able to establish this lower bound against pure differential privacy, or in the special case of $k = 2$. Our techniques follow the method of fingerprinting and are generally quite simple. Our lower bound for heavy-tailed estimation is based on a black-box reduction from privately estimating identity-covariance Gaussians. Our lower bound for covariance estimation utilizes a Bayesian approach to show that, under an Inverse Wishart prior distribution for the covariance matrix, no private estimator can be accurate even in expectation without sufficiently many samples.
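As a rough illustration of how these sample-complexity bounds scale, below is a minimal Python sketch that evaluates both stated lower bounds while dropping constants and logarithmic factors. The function names and example parameter values are illustrative choices, not taken from the paper.

def covariance_lower_bound(d, alpha, eps):
    # ~Omega(d^{3/2} / (alpha * eps) + d / alpha^2) samples are needed to estimate
    # a Gaussian covariance to spectral error alpha under approximate differential
    # privacy (constants and logarithmic factors dropped).
    return d ** 1.5 / (alpha * eps) + d / alpha ** 2

def heavy_tailed_mean_lower_bound(d, alpha, eps, k):
    # ~Omega(d / (alpha^{k/(k-1)} * eps) + d / alpha^2) samples are needed to estimate
    # the mean of a distribution with bounded k-th moments under approximate
    # differential privacy (constants and logarithmic factors dropped).
    return d / (alpha ** (k / (k - 1)) * eps) + d / alpha ** 2

# Illustrative parameters (not from the paper): d = 1000 dimensions,
# target error alpha = 0.1, privacy parameter eps = 1, moment bound k = 4.
print(covariance_lower_bound(1000, 0.1, 1.0))            # roughly 4.2e5 samples
print(heavy_tailed_mean_lower_bound(1000, 0.1, 1.0, 4))  # roughly 1.2e5 samples

In both expressions the $\frac{d}{\alpha^2}$ term corresponds to the non-private statistical cost of the task, while the $\varepsilon$-dependent term reflects the additional cost imposed by privacy.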
