Efficient MCMC for parameter inference for Markov jump processes

Abstract

Markov jump processes (MJPs) are continuous-time stochastic processes that find wide application in a variety of disciplines. Inference for MJPs typically proceeds via Markov chain Monte Carlo, the state of the art being a recently proposed auxiliary variable Gibbs sampler. This algorithm was designed for the situation where the MJP parameters are known, and Bayesian inference over unknown parameters is typically carried out by incorporating it into a larger Gibbs sampler. This strategy of alternately sampling the parameters given the path, and then the path given the parameters, can result in poor Markov chain mixing. In this work, we propose a simple and elegant algorithm to address this problem. Our scheme brings Metropolis-Hastings (MH) approaches for discrete-time hidden Markov models (HMMs) to the continuous-time setting, and also ties up some of the loose ends in previous work. The result is a complete and clean recipe for parameter and path inference in MJPs. In our experiments, we demonstrate superior performance over the Gibbs sampling approach, as well as over other approaches such as particle Markov chain Monte Carlo.
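
To make the alternating Gibbs structure criticized above concrete, here is a minimal sketch in Python. It is not the paper's algorithm: it uses a hypothetical 2-state MJP with Gamma priors on the leaving rates, and the path update is plain forward (Gillespie) simulation, whereas real posterior inference would condition the path on observations (e.g. via the auxiliary variable uniformization sampler the paper builds on). The parameter update, however, shows the standard conjugate Gamma step given a complete path.

```python
# Sketch of "path | parameters, then parameters | path" Gibbs alternation
# for a hypothetical 2-state MJP. Assumptions: Gamma(alpha, beta) priors on
# the two leaving rates; path step is unconditional forward simulation.
import numpy as np

rng = np.random.default_rng(0)
T = 10.0                      # length of the time window
alpha, beta = 2.0, 1.0        # Gamma(shape, rate) prior on each leaving rate

def sample_path(rates, T):
    """Forward-simulate a 2-state MJP on [0, T]; return jump times and states."""
    t, s = 0.0, 0
    times, states = [0.0], [0]
    while True:
        dwell = rng.exponential(1.0 / rates[s])
        if t + dwell > T:
            break
        t += dwell
        s = 1 - s                     # two states: always jump to the other one
        times.append(t)
        states.append(s)
    return np.array(times), np.array(states)

def sample_rates(times, states, T):
    """Conjugate Gamma update: shape += jumps out of state, rate += time in state."""
    new_rates = np.empty(2)
    ends = np.append(times, T)
    for s in (0, 1):
        idx = np.where(states == s)[0]
        time_in_s = np.sum(ends[idx + 1] - times[idx])
        n_jumps = np.sum(idx < len(times) - 1)   # final segment is censored at T, not a jump
        new_rates[s] = rng.gamma(alpha + n_jumps, 1.0 / (beta + time_in_s))
    return new_rates

rates = rng.gamma(alpha, 1.0 / beta, size=2)
for it in range(1000):                # alternate: path | rates, then rates | path
    times, states = sample_path(rates, T)
    rates = sample_rates(times, states, T)
print("final rates:", rates)
```

Because the path carries a great deal of information about the rates (total holding times and jump counts), the two conditional updates are strongly coupled; this coupling is the source of the poor mixing that the proposed MH scheme, which updates parameters with the path effectively marginalized over a random grid, is designed to avoid.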
