Nonparametric identification and maximum likelihood estimation for hidden Markov models

Nonparametric identification and maximum likelihood estimation for finite-state hidden Markov models are investigated. We obtain identification of the parameters as well as the order of the Markov chain if the transition probability matrices have full rank and are ergodic, and if the state-dependent distributions are all distinct, but not necessarily linearly independent. Based on this identification result, we develop nonparametric maximum likelihood estimation theory. First, we show that the asymptotic contrast, the Kullback-Leibler divergence of the hidden Markov model, identifies the true parameter vector nonparametrically as well. Second, for classes of state-dependent densities which are arbitrary mixtures of a parametric family, we show consistency of the nonparametric maximum likelihood estimator. Here, identification of the mixing distributions need not be assumed. Numerical properties of the estimates as well as of nonparametric goodness-of-fit tests are investigated in a simulation study.
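The criterion underlying both the asymptotic Kullback-Leibler contrast and the maximum likelihood estimator is the (normalized) log-likelihood of the hidden Markov model. The sketch below is only an illustration, not the paper's construction: it assumes a two-state HMM with Gaussian state-dependent densities and made-up parameter values, simulates observations, and evaluates the log-likelihood with the standard scaled forward recursion, i.e. the objective a maximum likelihood estimator would maximize over transition matrices and state-dependent densities.

```python
import numpy as np

# Illustrative 2-state HMM (assumed parameters, not taken from the paper):
# an ergodic, full-rank transition matrix and two distinct state-dependent
# Gaussian densities, mirroring the identification conditions above.
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])                 # transition probability matrix
means = np.array([0.0, 3.0])
sds = np.array([1.0, 1.0])

def simulate_hmm(T, rng):
    """Draw a length-T observation sequence from the illustrative HMM."""
    states = np.empty(T, dtype=int)
    states[0] = rng.integers(2)
    for t in range(1, T):
        states[t] = rng.choice(2, p=A[states[t - 1]])
    return rng.normal(means[states], sds[states])

def log_likelihood(x, A, means, sds):
    """Log-likelihood of the HMM computed by the scaled forward recursion."""
    dens = (np.exp(-0.5 * ((x[:, None] - means) / sds) ** 2)
            / (np.sqrt(2.0 * np.pi) * sds))        # state-dependent densities
    alpha = np.full(2, 0.5) * dens[0]              # uniform initial distribution
    ll = 0.0
    for t in range(len(x)):
        if t > 0:
            alpha = (alpha @ A) * dens[t]          # forward step
        c = alpha.sum()
        ll += np.log(c)
        alpha /= c                                 # rescale to avoid underflow
    return ll

rng = np.random.default_rng(0)
x = simulate_hmm(2000, rng)
# A maximum likelihood estimator maximizes this criterion; here we only
# evaluate it at the data-generating parameters for illustration.
print(log_likelihood(x, A, means, sds))
```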