Private, Efficient, and Optimal K-Norm and Elliptic Gaussian Noise For Sum, Count, and Vote

Annual Conference Computational Learning Theory (COLT), 2023
Abstract

Differentially private computation often begins with a bound on some $d$-dimensional statistic's $\ell_p$ sensitivity. For pure differential privacy, the $K$-norm mechanism can improve on this approach using statistic-specific (and possibly non-$\ell_p$) norms. However, running such mechanisms requires sampling from the corresponding norm balls. These are $d$-dimensional convex polytopes, for which the fastest known general sampling algorithm takes time $\tilde O(d^{3+\omega})$, where $\omega \geq 2$ is the matrix multiplication exponent. For concentrated differential privacy, elliptic Gaussian noise offers a similar improvement over spherical Gaussian noise, but the general method for computing the problem-specific elliptic noise requires solving a semidefinite program for each instance. This paper considers the simple problems of sum, count, and vote and provides faster algorithms in both settings. We construct optimal pure differentially private $K$-norm mechanism samplers and derive closed-form expressions for optimal concentrated differentially private elliptic Gaussian noise. Their runtimes are, respectively, $\tilde O(d^2)$ and $O(1)$, and the resulting algorithms all yield meaningful accuracy improvements. More broadly, we suggest that problem-specific sensitivity space analysis may be an overlooked tool for private additive noise.
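As context for the abstract, the standard $K$-norm mechanism adds noise with density proportional to $\exp(-\varepsilon\|z\|_K)$, which can be sampled as $r \cdot u$ with $r \sim \mathrm{Gamma}(d+1, 1/\varepsilon)$ and $u$ uniform in the unit ball of $K$. The sketch below illustrates this general construction for the simple case where $K$ is the $\ell_1$ ball (where uniform ball sampling is easy); it is not the paper's optimized sampler, and all function names are illustrative:

```python
import random

def l1_ball_uniform(d, rng):
    """Sample uniformly from the unit l1 ball in R^d."""
    # Normalized exponentials give a uniform point on the l1 simplex;
    # random signs spread it over the cross-polytope boundary, and
    # scaling by U^(1/d) makes the point uniform in the ball.
    e = [rng.expovariate(1.0) for _ in range(d)]
    s = sum(e)
    radius = rng.random() ** (1.0 / d)
    return [radius * (x / s) * rng.choice([-1.0, 1.0]) for x in e]

def k_norm_mechanism_l1(stat, eps, rng):
    """eps-DP K-norm mechanism when the sensitivity body K is the l1 ball:
    output stat + r*u with r ~ Gamma(d+1, scale 1/eps), u uniform in K."""
    d = len(stat)
    r = rng.gammavariate(d + 1, 1.0 / eps)
    u = l1_ball_uniform(d, rng)
    return [s_i + r * u_i for s_i, u_i in zip(stat, u)]

# Example: privatize a 3-dimensional sum statistic at eps = 1.
rng = random.Random(0)
noisy = k_norm_mechanism_l1([10.0, 20.0, 30.0], 1.0, rng)
```

One can check the calibration: under this construction the expected $\ell_1$ norm of the noise is $d/\varepsilon$, since $\mathbb{E}[r] = (d+1)/\varepsilon$ and a uniform point in the ball has expected norm $d/(d+1)$.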
