On general sampling schemes for Particle Markov chain Monte Carlo methods

Particle Markov chain Monte Carlo (PMCMC) methods are used to carry out inference in non-linear and non-Gaussian state space models, where the posterior density of the states is approximated using particles. Current approaches usually carry out Bayesian inference using a particle marginal Metropolis-Hastings (PMMH) algorithm, a particle Gibbs (PG) sampler, or a particle Metropolis within Gibbs (PMwG) sampler. Our article gives a general approach for constructing sampling schemes that converge to the target distributions given in the literature, and shows how the three ways of generating variables mentioned above can be combined flexibly. The advantage of this generality is that the sampling scheme can be tailored to obtain good results in different applications. We investigate the properties of the general sampling scheme, including conditions for uniform convergence to the posterior. We illustrate our methods with examples of state space models where one group of parameters can be generated straightforwardly in a PG step by conditioning on the states, while a second group of parameters is generated without conditioning on the states because of the high dependence between those parameters and the states.
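As a rough illustration of the kind of hybrid scheme the abstract describes, the sketch below alternates (i) a Metropolis-within-Gibbs update of one parameter, conditioning on a stored state trajectory, with (ii) a PMMH update of the remaining parameters, which marginalises over the states via a particle filter's likelihood estimate. The stochastic-volatility-style model, flat priors, random-walk proposals, and all tuning constants are assumptions chosen for the demo; an exact PG/PMwG step also requires a conditional SMC update of the trajectory, which is omitted here. Treat this as a structural sketch, not the authors' algorithm.

```python
# Minimal sketch of a hybrid PMwG + PMMH sampler (assumed model and tuning,
# not the paper's exact scheme). Step (i) updates beta conditional on a
# sampled state trajectory; step (ii) updates (phi, sigma) via PMMH using
# the particle filter's unbiased marginal-likelihood estimate.
import numpy as np

rng = np.random.default_rng(0)

def simulate(T, phi, sigma, beta):
    """Toy stochastic-volatility-style model (an assumption for this demo):
    x_t = phi x_{t-1} + sigma eta_t,  y_t = beta exp(x_t / 2) eps_t."""
    x = np.zeros(T); y = np.zeros(T)
    x[0] = rng.normal(0.0, sigma / np.sqrt(1.0 - phi**2))
    for t in range(T):
        if t > 0:
            x[t] = phi * x[t - 1] + sigma * rng.normal()
        y[t] = beta * np.exp(x[t] / 2.0) * rng.normal()
    return x, y

def log_obs_density(y, x, beta):
    # log N(y | 0, beta^2 exp(x)), vectorised over particles or time
    return -0.5 * (x + np.log(2.0 * np.pi * beta**2)
                   + y**2 / (beta**2 * np.exp(x)))

def particle_filter(y, phi, sigma, beta, N=200):
    """Bootstrap particle filter: returns an unbiased estimate of the log
    marginal likelihood and one trajectory drawn from the particle cloud."""
    T = len(y)
    paths = np.zeros((T, N))
    x = rng.normal(0.0, sigma / np.sqrt(1.0 - phi**2), N)
    loglik = 0.0
    for t in range(T):
        if t > 0:
            anc = rng.choice(N, size=N, p=w)       # multinomial resampling
            paths = paths[:, anc]
            x = phi * paths[t - 1] + sigma * rng.normal(size=N)
        paths[t] = x
        logw = log_obs_density(y[t], x, beta)
        m = logw.max()
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean())
        w /= w.sum()
    return loglik, paths[:, rng.choice(N, p=w)]

def sampler(y, n_iter=500):
    phi, sigma, beta = 0.9, 0.3, 1.0               # illustrative start values
    loglik, xs = particle_filter(y, phi, sigma, beta)
    for _ in range(n_iter):
        # (i) PMwG-style step: update beta conditional on the stored states
        # xs; p(y | x, beta) is tractable, so no particle filter is needed.
        # Flat priors are assumed, so the MH ratio involves likelihoods only.
        beta_p = abs(beta + 0.1 * rng.normal())
        if (np.log(rng.uniform())
                < log_obs_density(y, xs, beta_p).sum()
                - log_obs_density(y, xs, beta).sum()):
            beta = beta_p
            # Refresh the stored estimate under the new beta; the exact
            # schemes handle this via conditional SMC rather than a rerun.
            loglik, xs = particle_filter(y, phi, sigma, beta)
        # (ii) PMMH step: update (phi, sigma) jointly without conditioning
        # on the states, using the particle marginal-likelihood estimate.
        phi_p = phi + 0.02 * rng.normal()
        sigma_p = abs(sigma + 0.02 * rng.normal())
        if abs(phi_p) < 1.0:                       # keep the state stationary
            loglik_p, xs_p = particle_filter(y, phi_p, sigma_p, beta)
            if np.log(rng.uniform()) < loglik_p - loglik:
                phi, sigma, loglik, xs = phi_p, sigma_p, loglik_p, xs_p
        yield phi, sigma, beta

_, y = simulate(100, phi=0.95, sigma=0.25, beta=0.8)
draws = np.array(list(sampler(y)))
print("posterior means (phi, sigma, beta):", draws.mean(axis=0))
```

The split mirrors the abstract's closing example: beta enters only the observation density, so conditioning on the states makes its update cheap, while (phi, sigma) are strongly dependent on the states and are therefore moved through the marginal PMMH step.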