Estimating Discrete Markov Models From Various Incomplete Data Schemes
The parameters of a discrete stationary Markov model are the transition probabilities between states. Traditionally, the data consist of sequences of observed states for a given number of individuals over the whole observation period. In that case, estimating the transition probabilities is straightforward: one counts the one-step moves from each state to every other state. In many real-life problems, however, inference is much more difficult because the state sequences are not fully observed: the state of each individual is known only at certain values of the time variable. In this paper we review this field, focusing on Markov chain Monte Carlo (MCMC) algorithms for performing Bayesian inference and evaluating posterior distributions of the transition probabilities in this missing-data framework. We also propose a way to accelerate the classical Metropolis-Hastings technique for typical reliability problems, exploiting the dependence between the matrix rows to build an adaptive MCMC sampler.
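As a minimal sketch of the complete-data case described above, the counting estimator can be written as follows (the data here are hypothetical, and the function name is illustrative, not from the paper):

```python
import numpy as np

def estimate_transition_matrix(sequences, n_states):
    """MLE of a stationary transition matrix from fully observed
    state sequences: count one-step moves i -> j, then normalize rows."""
    counts = np.zeros((n_states, n_states))
    for seq in sequences:
        for i, j in zip(seq[:-1], seq[1:]):
            counts[i, j] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    # Guard against states that are never left (zero row sums)
    row_sums[row_sums == 0] = 1.0
    return counts / row_sums

# Hypothetical data: two individuals observed at every step, states 0..2
sequences = [[0, 1, 1, 2, 0], [1, 2, 2, 0, 1]]
P = estimate_transition_matrix(sequences, 3)
```

In the Bayesian setting with complete data, this same count matrix is conjugate to independent Dirichlet priors on the rows, so each posterior row is again Dirichlet; it is the missing-data case, where the counts themselves are unobserved, that calls for the MCMC machinery reviewed in the paper.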