Privacy Games: Optimal Protection Mechanism Design for Bayesian and Differential Privacy

Abstract

Perturbing information is an effective approach to protecting privacy. However, the privacy of users and the utility of the obfuscated information are at odds with each other. In this paper, we propose a methodology for designing protection mechanisms that optimally trade utility for privacy. We formulate the problem of maximizing the user's utility while guaranteeing her privacy as a non-zero-sum Stackelberg game. The defender (the user) leads the game by designing and committing to a protection mechanism, and the adversary follows by making inferences from the shared information. The solution of this game is optimal against any possible inference attack. We show that these games can be solved using linear programming. Our second contribution is the design of optimal protection mechanisms under the differential privacy metric. We find the values of epsilon that maximize privacy under utility constraints. Conversely, we design mechanisms that optimize utility for a given value of epsilon, taken as the bound on privacy. For a generic distance function between secrets, we construct these optimal differentially private mechanisms using linear and quadratic programming. The Bayesian and differential privacy metrics complement each other: the former measures the absolute privacy level a protection mechanism affords the user, while the latter measures the relative information leakage caused by observing the mechanism's output. Our third contribution is to combine the two notions: we design optimal obfuscation mechanisms that guarantee both Bayesian and differential privacy. Our work fills the gap between Bayesian and differential privacy, and it is the first to unify different privacy metrics and to provide a methodology for designing optimal protection mechanisms in the generic case. We show that our optimal joint Bayesian-differential mechanism is indeed superior to either of the two mechanisms individually.
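To make the linear-programming formulation concrete, the following is a minimal sketch (not the paper's exact program) of how one might compute an epsilon-differentially-private obfuscation mechanism that minimizes expected distortion, for a toy example with three location secrets. The uniform prior, absolute-value distance, absolute-value loss, and the use of `scipy.optimize.linprog` are all illustrative assumptions, not taken from the paper.

```python
# Illustrative LP: find p(o|s) minimizing expected loss subject to
# epsilon-differential-privacy constraints. Toy setup, assumed for this sketch.
import numpy as np
from scipy.optimize import linprog

n = m = 3                       # secrets s and observables o: positions 0, 1, 2
prior = np.full(n, 1.0 / n)     # adversary's prior pi(s), assumed uniform
eps = np.log(2)                 # privacy budget per unit of distance (assumed)
d = lambda s, t: abs(s - t)     # distance between secrets (assumed metric)
loss = lambda s, o: abs(s - o)  # utility loss when o is reported for secret s

idx = lambda s, o: s * m + o    # flatten p(o|s) into one LP variable vector

# Objective: minimize expected loss  sum_s pi(s) * sum_o p(o|s) * loss(s, o)
c = np.zeros(n * m)
for s in range(n):
    for o in range(m):
        c[idx(s, o)] = prior[s] * loss(s, o)

# DP constraints: p(o|s) <= exp(eps * d(s, s')) * p(o|s')  for all s != s', o
A_ub, b_ub = [], []
for o in range(m):
    for s in range(n):
        for t in range(n):
            if s == t:
                continue
            row = np.zeros(n * m)
            row[idx(s, o)] = 1.0
            row[idx(t, o)] = -np.exp(eps * d(s, t))
            A_ub.append(row)
            b_ub.append(0.0)

# Each row of the mechanism must be a probability distribution over outputs
A_eq = np.zeros((n, n * m))
for s in range(n):
    for o in range(m):
        A_eq[s, idx(s, o)] = 1.0
b_eq = np.ones(n)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=(0, 1), method="highs")
mechanism = res.x.reshape(n, m)  # mechanism[s, o] = p(o | s)
```

The resulting `mechanism` satisfies the epsilon-differential-privacy inequalities by construction and achieves lower expected distortion than, for instance, the uniformly random mechanism, which is trivially private but maximally lossy.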
