
A statistical framework for differential privacy

Abstract

One goal of statistical privacy research is to construct a data release mechanism that protects individual privacy while preserving information content. An example is a {\em random mechanism} that takes an input database $X$ and outputs a random database $Z$ according to a distribution $Q_n(\cdot \mid X)$. {\em Differential privacy} is a particular privacy requirement developed by computer scientists in which $Q_n(\cdot \mid X)$ is required to be insensitive to changes in any single data point in $X$. This makes it difficult to infer from $Z$ whether a given individual is in the original database $X$. We consider differential privacy from a statistical perspective and examine several data release mechanisms that satisfy the differential privacy requirement. We show that it is useful to compare these schemes by computing the rate of convergence of distributions and densities constructed from the released data. We study a general privacy method, called the {\em exponential mechanism}, introduced by McSherry and Talwar (2007). We show that the accuracy of this method is intimately linked to the rate at which the empirical distribution concentrates in a small ball around the true distribution.
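For concreteness, the following is a minimal sketch of the two objects the abstract refers to, written in the standard notation of the differential privacy literature; the symbols $\alpha$ (privacy parameter), $\xi$ (utility function), and $\Delta\xi$ (its sensitivity) are generic placeholders and not necessarily the notation used in the paper itself. A mechanism $Q_n$ is $\alpha$-differentially private if, for all databases $X$ and $X'$ differing in a single entry and all measurable sets $S$,
$$
Q_n(Z \in S \mid X) \;\le\; e^{\alpha}\, Q_n(Z \in S \mid X').
$$
The exponential mechanism of McSherry and Talwar (2007) meets this requirement by drawing the released output $Z$ from a density proportional to an exponentially tilted utility,
$$
h(z \mid X) \;\propto\; \exp\!\left( \frac{\alpha\, \xi(X, z)}{2\, \Delta\xi} \right),
$$
where $\Delta\xi$ bounds how much $\xi(X, z)$ can change when one entry of $X$ changes. Outputs with higher utility for the observed database are exponentially more likely, while the sensitivity scaling keeps the release insensitive to any single individual.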
