Estimate exponential memory decay in Hidden Markov Model and its applications

17 October 2017
F. Ye
Yian Ma
H. Qian
arXiv:1710.06078
Abstract

Inference in hidden Markov models has been difficult to scale because of the dependencies among the observations. In this paper, we exploit the inherent memory decay in hidden Markov models so that the forward and backward probabilities can be computed on subsequences, enabling efficient inference over long observation sequences. We formulate the forward filtering process as a random dynamical system, in which the product of i.i.d. random matrices admits Lyapunov exponents. The rate of memory decay is, almost surely, $\lambda_2 - \lambda_1$, the gap between the top two Lyapunov exponents. We propose an efficient and accurate algorithm to numerically estimate this gap after a soft-max parametrization. Given a controlled error $\epsilon$, the required subsequence length is $B = \log(\epsilon)/(\lambda_2 - \lambda_1)$. We prove the validity of the algorithm and demonstrate its effectiveness with numerical examples. The method developed here can be applied to widely used algorithms, such as the mini-batch stochastic gradient method. Moreover, the continuity of the Lyapunov spectrum ensures that the estimated $B$ can be reused for nearby parameters during inference.
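As a rough illustration of the quantities in the abstract (not the authors' estimator, which works after a soft-max parametrization), the sketch below estimates the top two Lyapunov exponents of the product of HMM forward operators with the standard QR re-orthonormalization scheme, then converts the gap into a subsequence length $B$ for a target error. The 2-state HMM, its parameter values, and the tolerance `eps` are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy HMM used only for this sketch (2 hidden states,
# 3 observation symbols); the paper's experiments are not reproduced here.
# A[i, j] = P(x_{t+1} = j | x_t = i); E[i, k] = P(y_t = k | x_t = i).
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
E = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.3, 0.6]])

def sample_observations(T):
    """Simulate a hidden-state path and emit T observations."""
    x = 0
    ys = np.empty(T, dtype=int)
    for t in range(T):
        x = rng.choice(2, p=A[x])
        ys[t] = rng.choice(3, p=E[x])
    return ys

def top_two_lyapunov(ys):
    """Estimate the top two Lyapunov exponents of the product of
    forward operators M_t = diag(E[:, y_t]) @ A.T by propagating an
    orthonormal 2-frame and re-orthonormalizing with QR at each step."""
    n = A.shape[0]
    Q = np.eye(n)[:, :2]              # orthonormal 2-frame
    log_sums = np.zeros(2)
    for y in ys:
        M = np.diag(E[:, y]) @ A.T    # unnormalized forward operator
        Q, R = np.linalg.qr(M @ Q)
        log_sums += np.log(np.abs(np.diag(R)))
    return log_sums / len(ys)         # (lambda_1, lambda_2)

ys = sample_observations(100_000)
lam1, lam2 = top_two_lyapunov(ys)
eps = 1e-6                            # target truncation error (assumed)
# lambda_2 - lambda_1 < 0 and log(eps) < 0, so B comes out positive.
B = int(np.ceil(np.log(eps) / (lam2 - lam1)))
print(f"lambda_1 ~ {lam1:.4f}, lambda_2 ~ {lam2:.4f}, B ~ {B}")
```

The QR accumulation is the classical Benettin-style estimator for Lyapunov spectra of random matrix products; a longer observation sequence tightens the almost-sure estimate of the gap, and hence of $B$.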
