Efficient Sampling on Riemannian Manifolds via Langevin MCMC

We study the task of efficiently sampling from a Gibbs distribution $\pi^* \propto e^{-f}$ over a Riemannian manifold $M$ via (geometric) Langevin MCMC; this algorithm involves computing exponential maps in random Gaussian directions and is efficiently implementable in practice. The key to our analysis of Langevin MCMC is a bound on the discretization error of the geometric Euler-Maruyama scheme, assuming $\nabla f$ is Lipschitz and $M$ has bounded sectional curvature. Our error bound matches the error of Euclidean Euler-Maruyama in terms of its stepsize dependence. Combined with a contraction guarantee for the geometric Langevin diffusion under Kendall-Cranston coupling, we prove that the Langevin MCMC iterates lie within $\epsilon$-Wasserstein distance of $\pi^*$ after $\tilde{O}(\epsilon^{-2})$ steps, which matches the iteration complexity for Euclidean Langevin MCMC. Our results apply in general settings where $f$ can be nonconvex and $M$ can have negative Ricci curvature. Under additional assumptions that the Riemannian curvature tensor has bounded derivatives, and that $\pi^*$ satisfies a $\mathsf{CD}(\cdot,\infty)$ condition, we analyze the stochastic gradient version of Langevin MCMC, and bound its iteration complexity by $\tilde{O}(\epsilon^{-2})$ as well.
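To make the algorithm concrete, below is a minimal illustrative sketch of one geometric Euler-Maruyama step on the unit sphere, where the exponential map has a closed form. This is not the paper's implementation: the helper names (`sphere_exp`, `geometric_langevin_step`), the toy potential $f(x) = -\langle \mu, x\rangle$, and the step size are hypothetical choices made for illustration. The update follows the geodesic from the current iterate along the negative Riemannian gradient plus $\sqrt{2\eta}$ times a Gaussian vector drawn in the tangent space.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere_exp(x, v):
    """Exponential map on the unit sphere: follow the geodesic
    from x in tangent direction v for arc length |v|."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

def tangent_project(x, g):
    """Project an ambient vector g onto the tangent space at x."""
    return g - np.dot(g, x) * x

def geometric_langevin_step(x, grad_f, step):
    """One geometric Euler-Maruyama step: exponential map along
    -step * grad f(x) + sqrt(2 * step) * (Gaussian tangent noise)."""
    drift = -step * tangent_project(x, grad_f(x))
    noise = tangent_project(x, rng.standard_normal(x.shape))
    return sphere_exp(x, drift + np.sqrt(2 * step) * noise)

# Toy potential f(x) = -<mu, x>, so the Gibbs density e^{-f}
# concentrates near mu (a hypothetical example, not from the paper).
mu = np.array([0.0, 0.0, 1.0])
grad_f = lambda x: -mu  # ambient gradient; projected inside the step

x = np.array([1.0, 0.0, 0.0])
for _ in range(1000):
    x = geometric_langevin_step(x, grad_f, step=0.05)

print(np.linalg.norm(x))  # iterates remain on the sphere
```

Because the noise is projected to the tangent space and the update travels along a geodesic, every iterate stays exactly on the manifold, which is the practical appeal of the exponential-map scheme over ambient-space discretizations.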