
Generalized Gaussian Mechanism for Differential Privacy

Abstract

Assessment of disclosure risk is of paramount importance in the research and applications of statistical disclosure limitation and data confidentiality. The concept of differential privacy (DP) formalizes privacy in probabilistic terms and provides a robust notion of privacy protection without making assumptions about the background knowledge of data intruders. Practical applications of DP involve the development of differentially private mechanisms and algorithms to generate sanitized results at a pre-specified privacy budget, among which the Laplace mechanism is a popular sanitizer. In this paper, we generalize the Laplace mechanism to the family of generalized Gaussian mechanisms (GGM) based on the $l_p$ global sensitivity (GS) of statistical queries. We present theoretical results on the requirements for the GGM to reach DP at pre-specified privacy parameters, and on the Gaussian mechanism of $(\epsilon,\delta)$-probabilistic DP as a special case of the GGM, with a new lower bound on the scale parameter to satisfy the specified DP. We also compare the statistical utility of the sanitized results, in terms of their dispersion or mean squared error, from the Gaussian mechanism and the Laplace mechanism in independent sanitization. Lastly, we investigate the connections and differences between the GGM and the Exponential mechanism based on the generalized Gaussian distribution.
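The abstract's starting point, the Laplace mechanism, can be illustrated with a minimal sketch. The code below is not the paper's GGM calibration; it shows only the standard Laplace release (noise scale equal to the $l_1$ global sensitivity divided by $\epsilon$) plus, for intuition, a generic sampler for generalized Gaussian noise of order $p$ via the gamma distribution. The function names and the `scale` parameter of the GG sampler are illustrative assumptions; calibrating that scale to a DP guarantee is exactly what the paper's theoretical results address.

```python
import numpy as np

def laplace_mechanism(true_value, l1_sensitivity, epsilon, rng=None):
    """Standard Laplace mechanism: add Laplace(0, GS_1/epsilon) noise,
    which satisfies epsilon-differential privacy."""
    rng = rng or np.random.default_rng()
    scale = l1_sensitivity / epsilon
    return true_value + rng.laplace(0.0, scale)

def generalized_gaussian_noise(scale, p, rng=None):
    """Draw one sample with density proportional to exp(-(|x|/scale)^p).
    p=1 recovers the Laplace density, p=2 the Gaussian density.
    Uses the fact that (|X|/scale)^p follows a Gamma(1/p, 1) law.
    NOTE: choosing `scale` to satisfy a given (epsilon, delta) is the
    paper's contribution and is NOT done here."""
    rng = rng or np.random.default_rng()
    magnitude = scale * rng.gamma(1.0 / p, 1.0) ** (1.0 / p)
    sign = rng.choice([-1.0, 1.0])
    return sign * magnitude
```

For example, `laplace_mechanism(count, 1.0, 0.5)` would sanitize a counting query (whose $l_1$ global sensitivity is 1) at privacy budget $\epsilon = 0.5$.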
