Unbiased Markov chain Monte Carlo with couplings
Markov chain Monte Carlo (MCMC) methods provide consistent approximations of integrals as the number of iterations goes to infinity. MCMC estimators are generally biased after any fixed number of iterations, which complicates both parallel computation and the construction of confidence intervals. We propose to remove this bias by using couplings of Markov chains together with a telescopic sum argument of Glynn & Rhee (2014). The resulting unbiased estimators can be computed in parallel, with confidence intervals following directly from the Central Limit Theorem for i.i.d. variables. We discuss practical couplings for popular algorithms such as Metropolis-Hastings, Gibbs samplers, and Hamiltonian Monte Carlo. We establish the theoretical validity of the proposed estimators and study their efficiency relative to the underlying MCMC algorithms. Finally, we illustrate the performance and limitations of the method on toy examples, a variable selection problem, and an approximation of the cut distribution arising in Bayesian inference for models made of multiple modules.
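The coupling construction described above can be sketched on a toy problem. The following is a minimal illustration, not the authors' code: a lag-1 coupled pair of random-walk Metropolis-Hastings chains with maximally coupled proposals and a common accept/reject uniform, combined with the Glynn & Rhee telescoping estimator H_k = h(X_k) + Σ_{t=k+1}^{τ-1} (h(X_t) − h(Y_{t−1})), where τ is the meeting time. The target (standard normal), the proposal scale, the burn-in k, and all function names are illustrative assumptions.

```python
import math
import random

def log_target(x):
    # Toy target: standard normal (illustrative choice, not from the paper).
    return -0.5 * x * x

def log_normal_pdf(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2.0 * math.pi))

def mh_step(x, sigma, rng):
    # One random-walk Metropolis-Hastings step for a single chain.
    prop = rng.gauss(x, sigma)
    if math.log(rng.random()) < log_target(prop) - log_target(x):
        return prop
    return x

def coupled_proposals(mu1, mu2, sigma, rng):
    # Maximal coupling of N(mu1, sigma^2) and N(mu2, sigma^2): the two
    # proposals are identical with the largest possible probability.
    x = rng.gauss(mu1, sigma)
    if math.log(rng.random()) + log_normal_pdf(x, mu1, sigma) <= log_normal_pdf(x, mu2, sigma):
        return x, x
    while True:
        y = rng.gauss(mu2, sigma)
        if math.log(rng.random()) + log_normal_pdf(y, mu2, sigma) > log_normal_pdf(y, mu1, sigma):
            return x, y

def unbiased_mh_estimate(h, k=50, sigma=1.0, rng=None, max_iter=100_000):
    # One unbiased estimate of E_pi[h(X)] via the telescoping estimator
    #   H_k = h(X_k) + sum_{t=k+1}^{tau-1} (h(X_t) - h(Y_{t-1})),
    # where tau is the first time the lag-1 coupled chains meet exactly.
    rng = rng or random.Random()
    X = [rng.gauss(0.0, 3.0)]            # overdispersed initialisation (arbitrary)
    Y = [rng.gauss(0.0, 3.0)]
    X.append(mh_step(X[0], sigma, rng))  # X leads Y by one step (lag 1)
    t = 1
    while X[t] != Y[t - 1]:
        if t >= max_iter:
            raise RuntimeError("chains failed to meet within max_iter steps")
        xp, yp = coupled_proposals(X[t], Y[t - 1], sigma, rng)
        log_u = math.log(rng.random())   # common uniform for both accept steps
        X.append(xp if log_u < log_target(xp) - log_target(X[t]) else X[t])
        Y.append(yp if log_u < log_target(yp) - log_target(Y[t - 1]) else Y[t - 1])
        t += 1
    tau = t
    while len(X) <= k:                   # chains met early: extend X marginally
        X.append(mh_step(X[-1], sigma, rng))
    est = h(X[k])
    for s in range(k + 1, tau):          # bias-correction terms (empty if tau <= k)
        est += h(X[s]) - h(Y[s - 1])
    return est
```

Independent replicates of `unbiased_mh_estimate` can be generated in parallel and averaged; because each replicate is unbiased, the i.i.d. Central Limit Theorem yields confidence intervals directly, as in the abstract. For this toy target, averaging many replicates with `h(x) = x` or `h(x) = x*x` should recover the standard normal moments 0 and 1.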