arXiv:1909.11778
Linear and Range Counting under Metric-based Local Differential Privacy

25 September 2019
Zhuolun Xiang, Bolin Ding, Xi He, Jingren Zhou
Abstract

Local differential privacy (LDP) enables private data sharing and analytics without the need for a trusted data collector. Error-optimal primitives (e.g., for estimating means and item frequencies) under LDP have been well studied. For analytical tasks such as range queries, however, the best known error bound depends on the domain size of the private data, which is potentially prohibitive. This deficiency is inherent, as LDP protects the same level of indistinguishability between any pair of private data values for each data owner. In this paper, we utilize an extension of ε-LDP called Metric-LDP or E-LDP, where a metric E defines heterogeneous privacy guarantees for different pairs of private data values and thus provides a more flexible knob than ε does to relax LDP and tune utility-privacy trade-offs. We show that, under such privacy relaxations, analytical workloads such as linear counting, multi-dimensional range counting queries, and quantile queries admit significant gains in utility. In particular, for range queries under E-LDP where the metric E is the L^1-distance function scaled by ε, we design mechanisms whose errors are independent of the domain sizes; instead, their errors depend on the metric E, which specifies at what granularity the private data is protected. We believe that the primitives we design for E-LDP will be useful in developing mechanisms for other analytical tasks, and we encourage the adoption of LDP in practice.
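To make the E-LDP idea concrete, the following is a minimal sketch (not taken from the paper; the function name and setup are illustrative) of a classic mechanism whose privacy guarantee degrades with the L^1 distance between inputs: the truncated geometric mechanism, which reports y with probability proportional to exp(-ε·|x − y|). For the untruncated mechanism the likelihood ratio between any two inputs x, x' is bounded by exp(ε·|x − x'|), i.e., metric-LDP with E(x, x') = ε·|x − x'|; truncation to a finite domain changes the normalizing constants and weakens the guarantee to at most 2ε·|x − x'|.

```python
import math

def geometric_mechanism_probs(x, domain_size, eps):
    """Output distribution of a truncated geometric mechanism on {0, ..., domain_size-1}.

    P[y | x] is proportional to exp(-eps * |x - y|), so nearby inputs produce
    similar output distributions while distant inputs remain distinguishable.
    The untruncated mechanism satisfies E-LDP with E(x, x') = eps * |x - x'|;
    truncation weakens this to at most 2 * eps * |x - x'| because the
    normalizing constant now depends on x.
    """
    weights = [math.exp(-eps * abs(x - y)) for y in range(domain_size)]
    total = sum(weights)
    return [w / total for w in weights]
```

A quick sanity check of the metric guarantee enumerates all input pairs (x, x') and outputs y in a small domain and verifies P[y | x] ≤ exp(2ε·|x − x'|) · P[y | x']; note how the bound grows with |x − x'|, which is exactly the heterogeneous protection the abstract describes.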
