Mechanism Design in Large Games: Incentives and Privacy
We study the design of mechanisms satisfying two desiderata: incentive compatibility and privacy. The first requires that each agent be incentivized to report her private information truthfully. The second, privacy, requires that the mechanism not reveal `much' about any agent's type to other agents. We propose a notion of privacy we call Joint Differential Privacy, a variant of Differential Privacy, a robust notion of privacy used in the Theoretical Computer Science literature. We show by construction that such mechanisms, i.e. ones that are both incentive compatible and jointly differentially private, exist when the game is `large': there are a large number of players, and any player's action affects any other player's payoff by at most a small amount. Our mechanism adds carefully selected noise to no-regret algorithms similar to those studied by Foster-Vohra and Hart-Mas-Colell. It therefore implements an approximate correlated equilibrium of the full-information game induced by players' reports.
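To make the flavor of the construction concrete, here is a minimal sketch, not the paper's actual mechanism: multiplicative-weights no-regret dynamics for a two-player matrix game, with Laplace noise added to each observed payoff vector standing in for the carefully calibrated noise of the mechanism. The function name, game encoding, and all parameters (`eta`, `noise_scale`) are illustrative assumptions; the empirical joint distribution of play approximates a correlated equilibrium of the reported game.

```python
import math
import random


def laplace(scale):
    # Laplace(0, scale) noise as a difference of two exponentials.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)


def private_no_regret(payoff, rounds=2000, eta=0.05, noise_scale=0.1):
    """Noisy multiplicative-weights dynamics (illustrative sketch only).

    payoff[i][a][b] is player i's payoff when the action profile is (a, b).
    Returns the empirical joint distribution of play over the run.
    """
    n_a = len(payoff[0])      # number of actions for player 0
    n_b = len(payoff[0][0])   # number of actions for player 1
    w0 = [1.0] * n_a
    w1 = [1.0] * n_b
    joint = [[0.0] * n_b for _ in range(n_a)]

    def sample(weights):
        # Draw an action index proportionally to its weight.
        r = random.random() * sum(weights)
        for i, w in enumerate(weights):
            r -= w
            if r <= 0:
                return i
        return len(weights) - 1

    for _ in range(rounds):
        a, b = sample(w0), sample(w1)
        joint[a][b] += 1.0 / rounds
        # Each player reweights every action by its noisy payoff against
        # the opponent's realized action; noise obscures individual reports.
        w0 = [w0[i] * math.exp(eta * (payoff[0][i][b] + laplace(noise_scale)))
              for i in range(n_a)]
        w1 = [w1[j] * math.exp(eta * (payoff[1][a][j] + laplace(noise_scale)))
              for j in range(n_b)]
        # Normalize to keep weights numerically stable over many rounds.
        s0, s1 = sum(w0), sum(w1)
        w0 = [w / s0 for w in w0]
        w1 = [w / s1 for w in w1]
    return joint
```

For example, running this on matching pennies (`payoff = [[[1, -1], [-1, 1]], [[-1, 1], [1, -1]]]`) yields a valid joint distribution over action profiles; the privacy guarantee in the paper comes from calibrating the noise, which this sketch does not attempt.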