Discrete Distribution Estimation under User-level Local Differential Privacy

7 November 2022
Jayadev Acharya, Yuhan Liu, Ziteng Sun
arXiv:2211.03757
Abstract

We study discrete distribution estimation under user-level local differential privacy (LDP). In user-level $\varepsilon$-LDP, each user has $m \ge 1$ samples and the privacy of all $m$ samples must be preserved simultaneously. We resolve the following dilemma: on the one hand, more samples per user should provide more information about the underlying distribution; on the other hand, guaranteeing the privacy of all $m$ samples should make the estimation task more difficult. We obtain tight bounds for this problem under almost all parameter regimes. Perhaps surprisingly, we show that in suitable parameter regimes, having $m$ samples per user is equivalent to having $m$ times more users, each with only one sample. Our results demonstrate interesting phase transitions for $m$ and the privacy parameter $\varepsilon$ in the estimation risk. Finally, connecting with recent results on shuffled DP, we show that, combined with random shuffling, our algorithm leads to optimal error guarantees (up to logarithmic factors) under the central model of user-level DP in certain parameter regimes. We provide several simulations to verify our theoretical findings.
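
For context, the user-level constraint in the abstract admits a standard formalization; the following is a minimal sketch in notation of our own choosing ($Q$ for a user's randomizer, $\mathcal{X}$ for the sample domain, $\mathcal{Y}$ for the message domain), not quoted from the paper. A randomizer $Q : \mathcal{X}^m \to \mathcal{Y}$ satisfies user-level $\varepsilon$-LDP if, for every pair of input tuples $x, x' \in \mathcal{X}^m$ and every output $y \in \mathcal{Y}$,

$$\Pr[Q(x) = y] \;\le\; e^{\varepsilon} \, \Pr[Q(x') = y].$$

Because $x$ and $x'$ may differ in all $m$ coordinates, all $m$ of a user's samples are protected at once; setting $m = 1$ recovers the standard item-level $\varepsilon$-LDP definition.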
