Concentration of the Langevin Algorithm's Stationary Distribution

24 December 2022
Jason M. Altschuler
Kunal Talwar
arXiv: 2212.12629 (abs / PDF / HTML)
Abstract

A canonical algorithm for log-concave sampling is the Langevin Algorithm, aka the Langevin Diffusion run with some discretization stepsize η > 0. This discretization leads the Langevin Algorithm to have a stationary distribution π_η which differs from the stationary distribution π of the Langevin Diffusion, and it is an important challenge to understand whether the well-known properties of π extend to π_η. In particular, while concentration properties such as isoperimetry and rapidly decaying tails are classically known for π, the analogous properties for π_η are open questions with direct algorithmic implications. This note provides a first step in this direction by establishing concentration results for π_η that mirror classical results for π. Specifically, we show that for any nontrivial stepsize η > 0, π_η is sub-exponential (respectively, sub-Gaussian) when the potential is convex (respectively, strongly convex). Moreover, the concentration bounds we show are essentially tight. Key to our analysis is the use of a rotation-invariant moment generating function (aka Bessel function) to study the stationary dynamics of the Langevin Algorithm. This technique may be of independent interest because it enables directly analyzing the discrete-time stationary distribution π_η without going through the continuous-time stationary distribution π as an intermediary.
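For context, below is a minimal sketch of the Langevin Algorithm referred to in the abstract: the standard Euler-Maruyama discretization x_{k+1} = x_k − η ∇f(x_k) + √(2η) ξ_k with ξ_k ~ N(0, I), for a potential f whose target density is proportional to exp(−f). The function and parameter names (langevin_algorithm, grad_f, eta, n_steps) are illustrative choices, not taken from the paper, which analyzes the stationary distribution π_η of this iteration rather than any particular implementation.

```python
import numpy as np

def langevin_algorithm(grad_f, x0, eta=0.1, n_steps=1_000, rng=None):
    """Run the (unadjusted) Langevin Algorithm with stepsize eta.

    grad_f : gradient of the potential f (target density ~ exp(-f))
    x0     : initial point
    Returns the final iterate, whose law approaches pi_eta as n_steps grows.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        # Gradient step plus Gaussian noise scaled by sqrt(2 * eta)
        x = x - eta * grad_f(x) + np.sqrt(2.0 * eta) * rng.standard_normal(x.shape)
    return x

# Illustrative use: strongly convex potential f(x) = ||x||^2 / 2,
# whose continuous-time stationary distribution pi is the standard Gaussian.
samples = np.array([
    langevin_algorithm(lambda x: x, np.zeros(2), eta=0.1, n_steps=500)
    for _ in range(100)
])
```

In this strongly convex example the abstract's result says π_η has sub-Gaussian tails; for a merely convex potential one would instead expect sub-exponential tails.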
