High-dimensional Bayesian inference via the Unadjusted Langevin Algorithm

Abstract

We consider in this paper the problem of sampling a high-dimensional probability distribution $\pi$ having a density with respect to the Lebesgue measure on $\mathbb{R}^d$, known up to a normalization factor: $x \mapsto \pi(x) = \mathrm{e}^{-U(x)} / \int_{\mathbb{R}^d} \mathrm{e}^{-U(y)} \, \mathrm{d}y$. Such a problem naturally occurs, for example, in Bayesian inference and machine learning. Under the assumptions that $U$ is continuously differentiable, $\nabla U$ is globally Lipschitz, and $U$ is strongly convex, we obtain non-asymptotic bounds on the convergence to stationarity, in Wasserstein distance of order $2$ and in total variation distance, of the sampling method based on the Euler discretization of the Langevin stochastic differential equation, for both constant and decreasing step sizes. The dependence of the obtained bounds on the dimension of the state space is studied to demonstrate the applicability of this method. The convergence of an appropriately weighted empirical measure is also investigated, and bounds on the mean square error together with an exponential deviation inequality are reported for measurable and bounded functions. An illustration of Bayesian inference for binary regression is presented.
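The sampling method analyzed here is the Unadjusted Langevin Algorithm: the Euler discretization of the Langevin SDE $\mathrm{d}X_t = -\nabla U(X_t)\,\mathrm{d}t + \sqrt{2}\,\mathrm{d}B_t$, which iterates $X_{k+1} = X_k - \gamma \nabla U(X_k) + \sqrt{2\gamma}\, Z_{k+1}$ with $Z_{k+1}$ standard Gaussian. Below is a minimal Python sketch of this recursion with a constant step size; the potential $U(x) = \|x\|^2/2$ (a standard Gaussian target) and the step size are illustrative choices for a strongly convex $U$, not the paper's setting or tuning.

```python
import numpy as np

def ula(grad_U, x0, gamma, n_iter, seed=None):
    """Unadjusted Langevin Algorithm: Euler discretization of the
    Langevin SDE dX_t = -grad U(X_t) dt + sqrt(2) dB_t."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_iter, x.size))
    for k in range(n_iter):
        noise = rng.standard_normal(x.size)
        # One Euler step with constant step size gamma.
        x = x - gamma * grad_U(x) + np.sqrt(2.0 * gamma) * noise
        samples[k] = x
    return samples

# Illustrative strongly convex potential U(x) = ||x||^2 / 2,
# so grad U(x) = x and the target pi is the standard Gaussian.
d = 100
samples = ula(grad_U=lambda x: x, x0=np.zeros(d),
              gamma=0.1, n_iter=10_000, seed=0)
```

Because no Metropolis accept/reject correction is applied, the chain targets a biased approximation of $\pi$ whose error is controlled by the step size; this is precisely the bias that the non-asymptotic Wasserstein and total variation bounds of the paper quantify.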
