Efficient Sampling on Riemannian Manifolds via Langevin MCMC

Abstract

We study the task of efficiently sampling from a Gibbs distribution $d\pi^* = e^{-h}\, d\mathrm{vol}_g$ over a Riemannian manifold $M$ via (geometric) Langevin MCMC; this algorithm involves computing exponential maps in random Gaussian directions and is efficiently implementable in practice. The key to our analysis of Langevin MCMC is a bound on the discretization error of the geometric Euler-Maruyama scheme, assuming $\nabla h$ is Lipschitz and $M$ has bounded sectional curvature. Our error bound matches the error of Euclidean Euler-Maruyama in terms of its stepsize dependence. Combined with a contraction guarantee for the geometric Langevin diffusion under Kendall-Cranston coupling, we prove that the Langevin MCMC iterates lie within $\epsilon$-Wasserstein distance of $\pi^*$ after $\tilde{O}(\epsilon^{-2})$ steps, which matches the iteration complexity for Euclidean Langevin MCMC. Our results apply in general settings where $h$ can be nonconvex and $M$ can have negative Ricci curvature. Under the additional assumptions that the Riemannian curvature tensor has bounded derivatives and that $\pi^*$ satisfies a $CD(\cdot,\infty)$ condition, we analyze the stochastic gradient version of Langevin MCMC and bound its iteration complexity by $\tilde{O}(\epsilon^{-2})$ as well.
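To make the update concrete, the abstract's recipe of "computing exponential maps in random Gaussian directions" amounts to the geometric Euler-Maruyama step $x_{k+1} = \mathrm{Exp}_{x_k}\!\big(-\eta\, \mathrm{grad}\, h(x_k) + \sqrt{2\eta}\, \xi_k\big)$, with $\xi_k$ Gaussian in the tangent space at $x_k$. Below is a minimal illustrative sketch (not the authors' code) on the unit sphere $S^2$, where the exponential map is available in closed form; the potential $h$, the step size, and all function names are assumptions chosen for the example.

```python
import numpy as np

def proj_tangent(x, v):
    """Project v onto the tangent space of the unit sphere at x."""
    return v - np.dot(x, v) * x

def exp_map(x, v):
    """Exponential map on the unit sphere: follow the geodesic from x with initial velocity v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

def langevin_step(x, grad_h, step, rng):
    """One geometric Euler-Maruyama step:
    Exp_x(-step * grad h(x) + sqrt(2*step) * xi), xi a tangent-space Gaussian."""
    xi = proj_tangent(x, rng.standard_normal(x.shape))
    drift = -step * proj_tangent(x, grad_h(x))  # Riemannian gradient via tangential projection
    return exp_map(x, drift + np.sqrt(2.0 * step) * xi)

# Example potential (an assumption for illustration): h(x) = <x, mu>,
# so pi* concentrates near -mu, a von Mises-Fisher-type target.
rng = np.random.default_rng(0)
mu = np.array([0.0, 0.0, 1.0])
grad_h = lambda x: mu  # Euclidean gradient of h; projected to the tangent space above
x = np.array([1.0, 0.0, 0.0])
for _ in range(1000):
    x = langevin_step(x, grad_h, step=0.01, rng=rng)
print(np.linalg.norm(x))  # iterates stay exactly on the manifold
```

Because the noise and drift are projected to the tangent space and the step is taken along a geodesic, every iterate remains on $S^2$ by construction, in contrast with a Euclidean Euler-Maruyama step followed by retraction.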
