Nonasymptotic analysis of Stochastic Gradient Hamiltonian Monte Carlo
under local conditions for nonconvex optimization
Journal of Machine Learning Research (JMLR), 2020
Ömer Deniz Akyildiz
Sotirios Sabanis
Abstract
We provide a nonasymptotic analysis of the convergence of the stochastic gradient Hamiltonian Monte Carlo (SGHMC) algorithm to a target measure in Wasserstein-2 distance without assuming log-concavity. By making the dimension dependence explicit, we provide a uniform convergence rate of order $\lambda^{1/4}$, where $\lambda$ is the step-size. Our results shed light on the performance of SGHMC methods compared to their overdamped counterparts, e.g., stochastic gradient Langevin dynamics (SGLD). Furthermore, our results imply that the SGHMC, when viewed as a nonconvex optimizer, converges to a global minimum with the best known rates.
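
For readers unfamiliar with the algorithm being analyzed, the following is a minimal sketch of a standard SGHMC iteration: the Euler discretization of underdamped (kinetic) Langevin dynamics driven by stochastic gradients. The friction gamma, inverse temperature beta, step-size lam, and the quadratic toy objective are illustrative assumptions for this sketch, not the paper's experimental setup or stated constants.

import numpy as np

def sghmc(stoch_grad, theta0, lam=1e-2, gamma=1.0, beta=1.0,
          n_iter=10_000, rng=None):
    """Run SGHMC and return the final iterate.

    stoch_grad: callable returning an unbiased estimate of grad U(theta),
    where the target density is proportional to exp(-beta * U).
    """
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float)
    v = np.zeros_like(theta)  # momentum (velocity) variable
    noise_scale = np.sqrt(2.0 * gamma * lam / beta)
    for _ in range(n_iter):
        # momentum update: friction, stochastic gradient, Gaussian noise
        v = (v - lam * gamma * v - lam * stoch_grad(theta)
             + noise_scale * rng.standard_normal(theta.shape))
        # position update uses the freshly updated momentum
        theta = theta + lam * v
    return theta

if __name__ == "__main__":
    # toy usage: target exp(-U) with U(x) = |x|^2 / 2; the gradient
    # estimate is the exact gradient plus small Gaussian noise
    rng = np.random.default_rng(0)
    noisy_grad = lambda x: x + 0.1 * rng.standard_normal(x.shape)
    print(sghmc(noisy_grad, theta0=np.ones(2), rng=rng))

The overdamped counterpart mentioned in the abstract, SGLD, drops the momentum variable and updates theta directly with a noisy gradient step plus Gaussian noise; the paper's comparison concerns how these two discretizations behave under local (non-log-concave) conditions.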
