LightDP: Towards Automating Differential Privacy Proofs

27 July 2016
Danfeng Zhang
Daniel Kifer
Abstract

The growing popularity and adoption of differential privacy in academic and industrial settings have resulted in the development of increasingly sophisticated algorithms for releasing information while preserving privacy. Accompanying this phenomenon is a natural rise in the development and publication of incorrect algorithms, demonstrating the necessity of formal verification tools. However, existing formal methods for differential privacy face a dilemma: methods based on customized logics can verify sophisticated algorithms but come with a steep learning curve and a significant annotation burden on programmers, while existing programming platforms lack the expressive power needed for some sophisticated algorithms. In this paper, we present LightDP, a simple imperative language that strikes a better balance between expressive power and usability. The core of LightDP is a novel relational type system that separates relational reasoning from privacy budget calculations. With dependent types, the type system is powerful enough to verify sophisticated algorithms where the composition theorem falls short. In addition, the inference engine of LightDP infers most of the proof details, and even searches for the proof with the minimal privacy cost bound when multiple proofs exist. We show that LightDP verifies sophisticated algorithms with little manual effort.
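As a concrete illustration of the kind of algorithm the abstract alludes to, where the standard composition theorem would charge a privacy cost growing with the number of queries even though the true cost stays fixed, consider the Sparse Vector Technique (AboveThreshold). The sketch below is a minimal Python illustration of that mechanism, not LightDP code from the paper; the function name, parameters, and noise scales follow the textbook AboveThreshold formulation and are assumptions for this example.

```python
import numpy as np

def above_threshold(queries, data, threshold, epsilon):
    """Sparse Vector Technique (AboveThreshold): report which query first
    exceeds a noisy threshold, halting after the first positive answer.

    The mechanism is epsilon-DP overall, whereas naive per-query composition
    would charge a budget proportional to the number of queries.
    Assumes each query q(data) has sensitivity 1.
    """
    # Perturb the threshold once, up front.
    noisy_threshold = threshold + np.random.laplace(scale=2.0 / epsilon)
    answers = []
    for q in queries:
        # Perturb each query answer and compare against the noisy threshold.
        noisy_answer = q(data) + np.random.laplace(scale=4.0 / epsilon)
        if noisy_answer >= noisy_threshold:
            answers.append(True)
            break  # halt after the first above-threshold result
        answers.append(False)
    return answers
```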
