Can independent Metropolis beat crude Monte Carlo?

Abstract

Assume that we would like to estimate the expected value of a function F with respect to a density π. We prove that if π is close enough, in KL divergence, to another density q, then an independent Metropolis sampler that targets π with proposal density q, enriched with a variance-reduction strategy based on control variates, achieves smaller asymptotic variance than the crude Monte Carlo estimator. The control-variate construction requires no extra computational effort but assumes that the expected value of F under q is analytically available. We illustrate this result by calculating the marginal likelihood in a linear regression model with prior-likelihood conflict and a non-conjugate prior. Furthermore, we propose an adaptive independent Metropolis algorithm that adapts the proposal density so that its KL divergence from the target is reduced. We demonstrate its applicability in Bayesian logistic and Gaussian process regression problems, and we rigorously justify our asymptotic arguments under easily verifiable and essentially minimal conditions.
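The flavor of the estimator can be sketched in a few lines. The sketch below runs an independent Metropolis chain targeting π with proposal q and then corrects the chain average with a generic control variate built from the proposal draws themselves, exploiting the fact that E_q[F] is known; this is an illustrative construction under a toy Gaussian setup (target N(0.5, 1), proposal N(0, 1), F(x) = x), not the specific control-variate strategy developed in the paper.

```python
import numpy as np


def log_normal_pdf(x, mu, sigma):
    """Log density of N(mu, sigma^2)."""
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))


def im_with_control_variate(log_pi, log_q, sample_q, F, mu_q_F, n, rng):
    """Independent Metropolis estimator of E_pi[F] with a control
    variate based on the proposal draws (illustrative sketch only).

    The proposal draws y_i cost nothing extra: they are already needed
    to run the chain, and E_q[F] = mu_q_F is assumed known in closed form.
    """
    x = sample_q(rng)
    chain, proposals = [], []
    for _ in range(n):
        y = sample_q(rng)  # fresh proposal draw, independent of the chain state
        # Independent Metropolis acceptance ratio: pi(y) q(x) / (pi(x) q(y))
        log_alpha = (log_pi(y) - log_pi(x)) + (log_q(x) - log_q(y))
        if np.log(rng.uniform()) < log_alpha:
            x = y
        chain.append(x)
        proposals.append(y)
    fx = F(np.asarray(chain))
    fy = F(np.asarray(proposals))
    # Estimate the (near-)optimal control-variate coefficient from the run,
    # then correct the chain average using the known mean of F under q.
    c = np.cov(fx, fy)[0, 1] / np.var(fy)
    return fx.mean() - c * (fy.mean() - mu_q_F)


# Toy example: target N(0.5, 1), proposal N(0, 1), F(x) = x, E_q[F] = 0.
rng = np.random.default_rng(0)
estimate = im_with_control_variate(
    log_pi=lambda x: log_normal_pdf(x, 0.5, 1.0),
    log_q=lambda x: log_normal_pdf(x, 0.0, 1.0),
    sample_q=lambda r: r.normal(0.0, 1.0),
    F=lambda x: x,
    mu_q_F=0.0,
    n=100_000,
    rng=rng,
)
print(estimate)  # should be close to the true mean 0.5
```

The key point the abstract makes is that when q is close to π in KL divergence, the accepted states and the proposal draws are strongly correlated, so this correction removes much of the estimator's variance at no additional sampling cost.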
