arXiv:2010.14658 (v2, latest)

Faster Differentially Private Samplers via Rényi Divergence Analysis of Discretized Langevin MCMC

27 October 2020
Arun Ganesh
Kunal Talwar
Abstract

Various differentially private algorithms instantiate the exponential mechanism, and require sampling from the distribution exp(-f) for a suitable function f. When the domain of the distribution is high-dimensional, this sampling can be computationally challenging. Using heuristic sampling schemes such as Gibbs sampling does not necessarily lead to provable privacy. When f is convex, techniques from log-concave sampling lead to polynomial-time algorithms, albeit with large polynomials. Langevin dynamics-based algorithms offer much faster alternatives under some distance measures such as statistical distance. In this work, we establish rapid convergence for these algorithms under distance measures more suitable for differential privacy. For smooth, strongly convex f, we give the first results proving convergence in Rényi divergence. This gives us fast differentially private algorithms for such f. Our techniques are simple and generic, and also apply to underdamped Langevin dynamics.
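To make the sampling setup concrete, here is a minimal sketch of the unadjusted (overdamped) Langevin algorithm, the standard discretization of Langevin dynamics for targeting a density proportional to exp(-f). The function name, step size, and example target are illustrative choices, not taken from the paper, and no privacy accounting is shown.

```python
import numpy as np

def ula_sample(grad_f, x0, eta, n_steps, rng=None):
    """Unadjusted Langevin algorithm (ULA) targeting exp(-f).

    Iterates x_{k+1} = x_k - eta * grad_f(x_k) + sqrt(2 * eta) * xi_k,
    where xi_k is standard Gaussian noise.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - eta * grad_f(x) + np.sqrt(2.0 * eta) * noise
    return x

# Illustrative target: f(x) = ||x||^2 / 2 (smooth and strongly convex),
# whose normalized density exp(-f) is the standard Gaussian N(0, I).
grad = lambda x: x
sample = ula_sample(grad, x0=np.zeros(3), eta=0.01, n_steps=5000)
```

For smooth, strongly convex f this iteration mixes rapidly; the paper's contribution is proving that convergence in Rényi divergence, the measure relevant for differential privacy, rather than only in statistical distance.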
