Asymptotic Analysis of Model Selection Criteria for General Hidden Markov Models

28 November 2018 (arXiv:1811.11834)
S. Yonekura
A. Beskos
Sumeetpal S. Singh
Abstract

The paper obtains analytical results for the asymptotic properties of Model Selection Criteria -- widely used in practice -- for a general family of hidden Markov models (HMMs), thereby substantially extending the related theory beyond typical i.i.d.-like model structures and filling in an important gap in the relevant literature. In particular, we look at the Bayesian and Akaike Information Criteria (BIC and AIC) and the model evidence. In the setting of nested classes of models, we prove that BIC and the evidence are strongly consistent for HMMs (under regularity conditions), whereas AIC is not weakly consistent. Numerical experiments support our theoretical results.
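For context, below is a minimal sketch of how order selection with AIC and BIC might look in practice for a Gaussian HMM. It assumes the third-party Python packages numpy and hmmlearn; the parameter-count formula and toy data are illustrative assumptions, and the paper's own numerical experiments are not reproduced here.

```python
# Illustrative sketch only (not the paper's code): selecting the number of
# hidden states for a Gaussian HMM by comparing AIC and BIC.
# Assumes the third-party `hmmlearn` and `numpy` packages.
import numpy as np
from hmmlearn.hmm import GaussianHMM


def n_free_params(k: int, d: int) -> int:
    """Free parameters of a k-state Gaussian HMM with diagonal covariances:
    (k-1) initial probabilities + k(k-1) transition probabilities
    + k*d means + k*d variances."""
    return (k - 1) + k * (k - 1) + 2 * k * d


def score_orders(X: np.ndarray, max_states: int = 5, seed: int = 0):
    """Fit HMMs with 1..max_states hidden states and return (k, AIC, BIC) triples."""
    n, d = X.shape
    results = []
    for k in range(1, max_states + 1):
        model = GaussianHMM(n_components=k, covariance_type="diag",
                            n_iter=200, random_state=seed)
        model.fit(X)
        loglik = model.score(X)            # log-likelihood of the observed sequence
        p = n_free_params(k, d)
        aic = 2 * p - 2 * loglik           # AIC = 2p - 2 log L-hat
        bic = p * np.log(n) - 2 * loglik   # BIC = p log n - 2 log L-hat
        results.append((k, aic, bic))
    return results


if __name__ == "__main__":
    # Toy data: a two-state Markov chain with well-separated Gaussian emissions,
    # so a consistent criterion should eventually pick k = 2.
    rng = np.random.default_rng(0)
    T, p_stay = 2000, 0.9
    states = np.zeros(T, dtype=int)
    for t in range(1, T):
        states[t] = states[t - 1] if rng.random() < p_stay else 1 - states[t - 1]
    X = rng.normal(loc=3.0 * states, scale=1.0).reshape(-1, 1)

    for k, aic, bic in score_orders(X):
        print(f"states={k}  AIC={aic:8.1f}  BIC={bic:8.1f}")
```

In this toy setting, the strong consistency result for BIC suggests its chosen order should settle on the true number of states as the sequence length grows, whereas AIC may continue to favor over-parameterized models.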
