ResearchTrend.AI

arXiv:1312.4620

On posterior concentration in misspecified models

17 December 2013
R. Ramamoorthi
Karthik Sriram
Ryan Martin
Abstract

We investigate the asymptotic behavior of Bayesian posterior distributions under independent and identically distributed (i.i.d.) misspecified models. More specifically, we study the concentration of the posterior distribution on neighborhoods of f⋆, the density in the model that is closest, in the Kullback–Leibler sense, to the true density f₀. We note, through examples, the need for assumptions beyond the usual Kullback–Leibler support condition. We then investigate consistency with respect to a general metric under three assumptions, each based on a notion of divergence measure, and apply these to a weighted L₁ metric in both convex and non-convex models. Although a few results on this topic are available, we believe they are somewhat inaccessible, due in part to technicalities and to subtle differences from the more familiar well-specified case. One of our goals is to make some of the available results, especially those of Kleijn and van der Vaart (2006), more accessible and transparent. Unlike their paper, our approach does not require the construction of test sequences. We also discuss a preliminary extension of the i.i.d. results to the independent but not identically distributed (i.n.i.d.) case.
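The phenomenon the abstract describes can be illustrated numerically. The following is a minimal sketch, not taken from the paper: the distributions, prior, and sample size are all hypothetical choices. Data come from f₀ = Exponential(1), but we fit the misspecified location family N(θ, 1); the Kullback–Leibler minimizer within that family is N(θ⋆, 1) with θ⋆ = E_{f₀}[X] = 1, and the (conjugate) posterior on θ concentrates near θ⋆ rather than near any "true" parameter.

```python
import random

# Hypothetical illustration: true density f0 = Exponential(1),
# misspecified model family {N(theta, 1) : theta in R}.
# KL-minimizing density f* is N(theta*, 1) with theta* = E[X] = 1.

random.seed(0)
n = 20000
data = [random.expovariate(1.0) for _ in range(n)]
xbar = sum(data) / n

# Conjugate N(m0, s0^2) prior on theta, with known model variance 1:
# posterior precision = 1/s0^2 + n, posterior mean is a precision-weighted
# average of the prior mean and the sample mean.
m0, s0sq = 0.0, 100.0
post_prec = 1.0 / s0sq + n
post_mean = (m0 / s0sq + n * xbar) / post_prec
post_sd = post_prec ** -0.5

print(f"posterior mean = {post_mean:.3f}, posterior sd = {post_sd:.4f}")
# The posterior piles up tightly around theta* = 1 even though no member
# of the model N(theta, 1) equals the true exponential density.
```

As the abstract notes, such concentration at f⋆ is not automatic: it requires conditions beyond Kullback–Leibler support, which is what the paper's assumptions supply.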
