
Higher Order Langevin Monte Carlo Algorithm

Sotirios Sabanis
Abstract

A new (unadjusted) Langevin Monte Carlo (LMC) algorithm with improved rates in total variation and in Wasserstein distance is presented. All these are obtained in the context of sampling from a target distribution $\pi$ that has a density $\hat{\pi}$ on $\mathbb{R}^d$ known up to a normalizing constant. Moreover, $-\log \hat{\pi}$ is assumed to have a locally Lipschitz gradient and its third derivative is locally Hölder continuous with exponent $\beta \in (0,1]$. Non-asymptotic bounds are obtained for the convergence to stationarity of the new sampling method with convergence rate $1 + \beta/2$ in Wasserstein distance, while it is shown that the rate is $1$ in total variation even in the absence of convexity. Finally, in the case where $-\log \hat{\pi}$ is strongly convex and its gradient is Lipschitz continuous, explicit constants are provided.
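For context, the sketch below implements the standard first-order unadjusted Langevin algorithm (ULA), the baseline that the paper's higher order scheme improves upon; the paper's actual algorithm, which adds correction terms built from higher derivatives of $-\log \hat{\pi}$ to reach the $1 + \beta/2$ Wasserstein rate, is not reproduced here. The Gaussian example target, the step size `gamma`, and all function names are illustrative assumptions, not part of the paper.

```python
# Minimal sketch (NOT the paper's higher order scheme): the standard
# first-order unadjusted Langevin algorithm as a baseline.
# Target, step size and dimension below are illustrative assumptions.
import numpy as np

def grad_neg_log_density(theta):
    # Gradient of U(theta) = -log pi_hat(theta); a standard Gaussian target
    # is assumed purely for illustration, so U(theta) = |theta|^2 / 2.
    return theta

def ula_sample(theta0, gamma, n_steps, rng):
    # theta_{k+1} = theta_k - gamma * grad U(theta_k) + sqrt(2*gamma) * xi_k,
    # xi_k ~ N(0, I_d): the Euler-Maruyama discretisation of the Langevin
    # diffusion d theta_t = -grad U(theta_t) dt + sqrt(2) dB_t.
    theta = np.array(theta0, dtype=float)
    samples = np.empty((n_steps, theta.size))
    for k in range(n_steps):
        noise = rng.standard_normal(theta.size)
        theta = theta - gamma * grad_neg_log_density(theta) \
                + np.sqrt(2.0 * gamma) * noise
        samples[k] = theta
    return samples

rng = np.random.default_rng(0)
chain = ula_sample(theta0=np.zeros(2), gamma=0.01, n_steps=10_000, rng=rng)
print(chain.mean(axis=0), chain.var(axis=0))  # roughly 0 and 1 for this target
```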
