Within-Dataset Disclosure Risk for Differential Privacy

19 October 2023
Zhiru Zhu
Raul Castro Fernandez
Abstract

Differential privacy (DP) enables private data analysis. In a typical DP deployment, controllers manage individuals' sensitive data and are responsible for answering analysts' queries while protecting individuals' privacy. They do so by choosing the privacy parameter ε, which controls the degree of privacy for all individuals in all possible datasets. However, it is challenging for controllers to choose ε because of the difficulty of interpreting the privacy implications of such a choice on the within-dataset individuals. To address this challenge, we first derive a relative disclosure risk indicator (RDR) that indicates the impact of choosing ε on the within-dataset individuals' disclosure risk. We then design an algorithm to find ε based on controllers' privacy preferences, expressed as a function of the within-dataset individuals' RDRs, and an alternative algorithm that finds and releases ε while satisfying DP. Lastly, we propose a solution that bounds the total privacy leakage when using the algorithm to answer multiple queries, without requiring controllers to set the total privacy budget. We evaluate our contributions through an IRB-approved user study showing that the RDR helps controllers choose ε, and through experimental evaluations showing that our algorithms are efficient and scalable.

@article{zhu2025_2310.13104,
  title={Within-Dataset Disclosure Risk for Differential Privacy},
  author={Zhiru Zhu and Raul Castro Fernandez},
  journal={arXiv preprint arXiv:2310.13104},
  year={2025}
}