Dynamics of Bayesian Updating with Dependent Data and Misspecified Models

11 January 2009
C. Shalizi
arXiv: 0901.1342
Abstract

Recent work on the convergence of posterior distributions under Bayesian updating has established conditions under which the posterior will concentrate on the truth, if the latter has a perfect representation within the support of the prior, and under various dynamical assumptions, such as the data-generating process being independent and identically distributed or a Markov process. Here I establish sufficient conditions for the convergence of the posterior distribution in non-parametric problems even when all of the hypotheses are wrong, and the data-generating process has a complicated dependence structure. The main dynamical assumption is the generalized asymptotic equipartition (or "Shannon-McMillan-Breiman") property of information theory. I derive a kind of large deviations principle for the posterior measure, and discuss the advantages of predicting using a combination of models known to be wrong. An appendix sketches connections between the present results and the "replicator dynamics" of evolutionary theory.
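To make the "all hypotheses are wrong, yet the posterior still converges" behavior concrete, here is a minimal simulation sketch that is not from the paper: the data come from a two-state Markov chain (so the observations are dependent), while every hypothesis in the support of the prior is an i.i.d. Bernoulli model, so every hypothesis is misspecified. All parameter values, the grid of models, and the function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# True data-generating process: a two-state Markov chain, so the data are
# dependent. Transition probabilities are illustrative, not from the paper.
p_01, p_10 = 0.3, 0.1                  # P(1 | prev=0), P(0 | prev=1)
stationary_1 = p_01 / (p_01 + p_10)    # long-run frequency of 1s (= 0.75)

def simulate_chain(n):
    x = np.empty(n, dtype=int)
    x[0] = rng.integers(2)
    for t in range(1, n):
        if x[t - 1] == 0:
            x[t] = rng.random() < p_01      # move 0 -> 1 with prob p_01
        else:
            x[t] = rng.random() >= p_10     # stay at 1 with prob 1 - p_10
    return x

# Misspecified hypothesis class: i.i.d. Bernoulli(theta) models, none of which
# can capture the Markov dependence. Uniform prior over a finite grid of thetas.
thetas = np.array([0.10, 0.30, 0.50, 0.70, 0.75, 0.90])
log_prior = -np.log(len(thetas)) * np.ones(len(thetas))

x = simulate_chain(20_000)
ones = np.cumsum(x)                     # running count of 1s
n = np.arange(1, len(x) + 1)

# Log-likelihood of each i.i.d. model after n observations, then the posterior.
loglik = ones[:, None] * np.log(thetas) + (n - ones)[:, None] * np.log1p(-thetas)
log_post = log_prior + loglik
log_post -= log_post.max(axis=1, keepdims=True)   # numerical stabilization
post = np.exp(log_post)
post /= post.sum(axis=1, keepdims=True)

print(f"stationary P(X=1) of the true chain: {stationary_1:.3f}")
for theta, p in zip(thetas, post[-1]):
    print(f"theta = {theta:.2f}   posterior after n={len(x)}: {p:.4f}")
# The posterior piles up on theta = 0.75: the i.i.d. model with the smallest
# relative-entropy rate to the true dependent process, even though every
# hypothesis in the support of the prior is wrong.
```

In this toy setting, ordinary ergodic averaging of the log-likelihood plays the role that the generalized asymptotic equipartition property plays in the paper's much more general setting, and the posterior concentrates on the hypothesis minimizing the relative-entropy rate to the true process rather than on the (absent) truth.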
