Underdamped Langevin MCMC with third order convergence
In this paper, we propose a new numerical method for the underdamped Langevin diffusion (ULD) and present a non-asymptotic analysis of its sampling error in the 2-Wasserstein distance when the d-dimensional target distribution π(x) ∝ exp(−f(x)) is strongly log-concave and has varying degrees of smoothness. Precisely, under the assumptions that the gradient and Hessian of the potential f are Lipschitz continuous, our algorithm achieves a 2-Wasserstein error of ε within step complexities comparable to those of other popular Langevin MCMC algorithms under matching assumptions. However, if we additionally assume that the third derivative of f is Lipschitz continuous, then our algorithm achieves third order convergence, reducing the number of steps required for a 2-Wasserstein error of ε to O(ε^(−1/3)). To the best of our knowledge, this is the first gradient-only method for ULD with third order convergence. To support our theory, we perform Bayesian logistic regression across a range of real-world datasets, where our algorithm achieves competitive performance compared to an existing underdamped Langevin MCMC algorithm and the popular No-U-Turn Sampler (NUTS).