
Proximal Markov chain Monte Carlo algorithms

Abstract

This paper presents two new Langevin Markov chain Monte Carlo methods that use convex analysis to simulate efficiently from high-dimensional densities that are log-concave, a class of probability distributions that is widely used in modern high-dimensional statistics and data analysis. The methods are based on a new first-order approximation for Langevin diffusions that exploits log-concavity to construct Markov chains with favourable convergence properties. This approximation is closely related to Moreau-Yoshida regularisations for convex functions and uses proximity mappings instead of gradient mappings to approximate the continuous-time process. The first method presented in this paper is an unadjusted Langevin algorithm for log-concave distributions. The method is shown to be stable and geometrically ergodic in many cases for which the conventional unadjusted Langevin algorithm is transient or explosive. The second method is a new Metropolis adjusted Langevin algorithm that complements the unadjusted algorithm with a Metropolis-Hastings step guaranteeing convergence to the desired target density. It is shown that this method inherits the convergence properties of the unadjusted algorithm and is geometrically ergodic in many cases for which the conventional Metropolis adjusted Langevin algorithm is not. The capacity of the methods to handle high-dimensional models efficiently is demonstrated empirically on several challenging signal processing and regression problems related to audio compressive sensing, low-rank matrix estimation and image resolution enhancement that are not well addressed by existing MCMC methodology.
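To make the idea of replacing gradient mappings with proximity mappings concrete, the following is a minimal illustrative sketch (not the paper's exact algorithm or tuning): it relies only on the standard identity that the gradient of the Moreau-Yoshida envelope of a convex g is (x - prox_{λg}(x))/λ, and uses it as the drift in an unadjusted Langevin step. The target, step size delta, and smoothing parameter lam below are illustrative choices; the example samples an isotropic Laplace density, which is log-concave but not differentiable at the origin, so a plain gradient-based Langevin step is not directly available.

```python
import numpy as np

rng = np.random.default_rng(0)

def prox_l1(x, lam):
    """Proximity mapping of g(x) = ||x||_1, i.e. soft-thresholding."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def proximal_ula(prox_g, x0, n_samples, delta, lam):
    """Unadjusted Langevin steps driven by the Moreau-Yoshida envelope of g = -log pi.

    The envelope gradient (x - prox_{lam*g}(x)) / lam acts as a smooth
    surrogate for grad g, so each step is an ordinary Langevin update
    with that surrogate drift plus Gaussian noise.
    """
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_samples, x.size))
    for k in range(n_samples):
        grad_env = (x - prox_g(x, lam)) / lam                      # surrogate for grad g(x)
        x = x - 0.5 * delta * grad_env \
              + np.sqrt(delta) * rng.standard_normal(x.size)       # Euler-Maruyama step
        samples[k] = x
    return samples

# Target: pi(x) proportional to exp(-||x||_1) in 2 dimensions.
chain = proximal_ula(prox_l1, x0=np.zeros(2), n_samples=50_000, delta=0.1, lam=0.1)
print("empirical std per coordinate:", chain[1000:].std(axis=0))   # Laplace(1) std is sqrt(2) ~ 1.41
```

The Metropolis adjusted variant described in the abstract would additionally treat each such move as a proposal and accept or reject it with a Metropolis-Hastings step, which removes the discretisation bias of the unadjusted chain; that correction is not shown in this sketch.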
