Particle Metropolis adjusted Langevin algorithms

Pseudo-marginal and particle MCMC have recently been introduced as classes of algorithms that can be used to analyse models where the likelihood function is intractable. They use Monte Carlo methods, such as particle filters, to estimate the posterior density, and MCMC moves to update the model parameters. Particle filter algorithms can also produce Monte Carlo estimates of the gradient of the log-posterior, which can then be used within the MCMC proposal distribution for the parameters. The resulting particle MCMC algorithm can be viewed as an approximation to the Metropolis adjusted Langevin algorithm, which we call particle MALA. We investigate the theoretical properties of particle MALA under standard asymptotics, which correspond to an increasing dimension of the parameters, n. Our results show that the behaviour of particle MALA depends crucially on how accurately one can estimate the gradient of the log-posterior. If the error in the estimate of the gradient is not controlled sufficiently well as the dimension increases, then asymptotically there will be no advantage in using particle MALA over the simpler random-walk proposal. However, if the error is well-behaved, then the optimal scaling of particle MALA proposals will be O(n^{-1/6}), compared to O(n^{-1/2}) for the random-walk proposal. Our theory also gives guidelines on how to tune the number of particles and the step size used within particle MALA.
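To make the idea concrete, the following is a minimal sketch of one particle MALA update, not the authors' implementation: it assumes a user-supplied routine (here called `estimate`, an illustrative name) that returns a particle-filter estimate of the log-posterior and of its gradient at a parameter value, uses the gradient estimate in a Langevin-style Gaussian proposal, and accepts or rejects with a pseudo-marginal Metropolis-Hastings ratio.

```python
import numpy as np

def particle_mala_step(theta, log_post_hat, grad_hat, estimate, step, rng):
    """One particle MALA update (illustrative sketch).

    `estimate(theta, rng)` is assumed to return (log_post_estimate, grad_estimate),
    e.g. from a particle filter run at `theta`; the name and signature are
    assumptions for this example, not taken from the paper.
    """
    d = theta.size

    # Langevin (MALA) proposal: Gaussian centred on a gradient step from theta.
    mean_fwd = theta + 0.5 * step**2 * grad_hat
    theta_prop = mean_fwd + step * rng.standard_normal(d)

    # Fresh particle-filter estimates at the proposed point.
    log_post_prop, grad_prop = estimate(theta_prop, rng)

    # Log-density of the isotropic Gaussian proposal (constants cancel).
    def log_q(x, mean):
        return -0.5 * np.sum((x - mean) ** 2) / step**2

    mean_bwd = theta_prop + 0.5 * step**2 * grad_prop
    log_alpha = (log_post_prop - log_post_hat
                 + log_q(theta, mean_bwd) - log_q(theta_prop, mean_fwd))

    if np.log(rng.uniform()) < log_alpha:
        return theta_prop, log_post_prop, grad_prop, True
    return theta, log_post_hat, grad_hat, False
```

Note the pseudo-marginal design choice: the estimates accepted at the current state are carried forward and reused, rather than re-estimated at every iteration, which is what keeps the chain targeting the correct posterior despite the noisy likelihood and gradient estimates.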