Differential privacy (DP) enables private data analysis. In a typical DP deployment, data controllers manage individuals' sensitive data and are responsible for answering analysts' queries while protecting individuals' privacy. They do so by choosing the privacy parameter ε, which controls the degree of privacy for all individuals in all possible datasets. However, it is challenging for controllers to choose ε because of the difficulty of interpreting the privacy implications of such a choice on the within-dataset individuals. To address this challenge, we first derive a relative disclosure risk indicator (RDR) that indicates the impact of choosing ε on the within-dataset individuals' disclosure risk. We then design an algorithm to find ε based on controllers' privacy preferences expressed as a function of the within-dataset individuals' RDRs, and an alternative algorithm that finds and releases ε while satisfying DP. Lastly, we propose a solution that bounds the total privacy leakage when using the algorithm to answer multiple queries without requiring controllers to set the total privacy budget. We evaluate our contributions through an IRB-approved user study showing that the RDR helps controllers choose ε, and through experimental evaluations showing that our algorithms are efficient and scalable.
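The sketch below is only illustrative of the general idea described in the abstract: it uses the textbook ε-DP bound on an adversary's posterior-to-prior membership odds as a stand-in disclosure-risk measure (not the paper's actual RDR definition, which is not given here), and a simple binary search as a stand-in for the paper's algorithm for selecting ε from a controller's risk preference. All function names and the risk formula are assumptions for illustration.

```python
import math

def posterior_bound(eps: float, prior: float) -> float:
    """Upper bound on an adversary's posterior membership belief after an
    eps-DP release: posterior odds <= exp(eps) * prior odds."""
    odds = prior / (1.0 - prior)
    post_odds = math.exp(eps) * odds
    return post_odds / (1.0 + post_odds)

def relative_risk(eps: float, prior: float) -> float:
    """Disclosure risk relative to the prior (>= 1); a hypothetical stand-in
    for a 'relative disclosure risk' indicator."""
    return posterior_bound(eps, prior) / prior

def find_eps(max_relative_risk: float, prior: float,
             lo: float = 1e-4, hi: float = 10.0, iters: int = 60) -> float:
    """Binary-search the largest eps whose implied relative risk stays within
    the controller's preference (relative_risk is increasing in eps)."""
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if relative_risk(mid, prior) <= max_relative_risk:
            lo = mid
        else:
            hi = mid
    return lo

if __name__ == "__main__":
    prior = 0.01  # adversary's assumed prior that a target individual is in the dataset
    eps = find_eps(max_relative_risk=2.0, prior=prior)
    print(f"eps ~= {eps:.3f}, relative risk = {relative_risk(eps, prior):.3f}")
```

In this toy setting, the controller states a tolerance ("no individual's risk should more than double") and the search returns the largest ε consistent with it; the paper's algorithms additionally handle per-individual RDRs, DP release of ε itself, and leakage accounting across multiple queries.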
@article{zhu2025_2310.13104,
  title   = {Within-Dataset Disclosure Risk for Differential Privacy},
  author  = {Zhiru Zhu and Raul Castro Fernandez},
  journal = {arXiv preprint arXiv:2310.13104},
  year    = {2025}
}