General-Purpose f-DP Estimation and Auditing in a Black-Box Setting

Abstract

In this paper we propose new methods to statistically assess f-Differential Privacy (f-DP), a recent refinement of differential privacy (DP) that remedies certain weaknesses of standard DP (including tightness under algorithmic composition). A challenge when deploying differentially private mechanisms is that DP is hard to validate, especially in the black-box setting. This has led to numerous empirical methods for auditing standard DP, while f-DP remains less explored. We introduce new black-box methods for f-DP that, unlike existing approaches for this privacy notion, do not require prior knowledge of the investigated algorithm. Our procedure yields a complete estimate of the f-DP trade-off curve, with theoretical guarantees of convergence. Additionally, we propose an efficient auditing method that empirically detects f-DP violations with statistical certainty, merging techniques from non-parametric estimation and optimal classification theory. Through experiments on a range of DP mechanisms, we demonstrate the effectiveness of our estimation and auditing procedures.
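
For reference, the trade-off curve mentioned above is the standard object of the f-DP literature; the notation below is a brief sketch of that definition and is not taken from the abstract itself:

    T_{P,Q}(\alpha) \;=\; \inf\bigl\{\, 1 - \mathbb{E}_Q[\phi] \;:\; \phi \text{ is a test with } \mathbb{E}_P[\phi] \le \alpha \,\bigr\}, \qquad \alpha \in [0,1],

and a mechanism M satisfies f-DP if T_{M(S),\,M(S')}(\alpha) \ge f(\alpha) for all \alpha \in [0,1] and all neighboring datasets S, S'. The estimation and auditing procedures summarized in the abstract target this curve directly.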

@article{askin2025_2502.07066,
  title={General-Purpose $f$-DP Estimation and Auditing in a Black-Box Setting},
  author={\"Onder Askin and Holger Dette and Martin Dunsche and Tim Kutta and Yun Lu and Yu Wei and Vassilis Zikas},
  journal={arXiv preprint arXiv:2502.07066},
  year={2025}
}