Fast sampling and model selection for Bayesian mixture models

13 January 2025
M. E. J. Newman
Main text: 22 pages · Appendix: 14 pages · Bibliography: 4 pages · 5 figures · 3 tables
Abstract

We study Bayesian estimation of mixture models and argue in favor of fitting the marginal posterior distribution over component assignments directly, rather than Gibbs sampling from the joint posterior on components and parameters as is commonly done. Some previous authors have found the former approach to have slow mixing, but we show that, implemented correctly, it can achieve excellent performance. In particular, we describe a new Monte Carlo algorithm for sampling from the marginal posterior of a general integrable mixture that makes use of rejection-free sampling from the prior over component assignments to achieve excellent mixing times in typical applications, outperforming standard Gibbs sampling, in some cases by a wide margin. We demonstrate the approach with a selection of applications to Gaussian, Poisson, and categorical models.
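The abstract contrasts sampling on the marginal posterior over component assignments, with parameters integrated out, against joint Gibbs sampling on assignments and parameters. The paper's own algorithm additionally uses rejection-free sampling from the prior over assignments, which is not reproduced here; the sketch below only illustrates the more familiar collapsed-Gibbs version of the marginal approach, for a one-dimensional Gaussian mixture with conjugate priors. All hyperparameters and function names are assumptions made for the example, not taken from the paper.

```python
import numpy as np

def collapsed_gibbs_gmm(x, K, n_sweeps=200, sigma2=1.0,
                        mu0=0.0, tau02=10.0, alpha=1.0, rng=None):
    """Collapsed Gibbs sampling of component assignments for a 1-D
    Gaussian mixture with known component variance sigma2, a
    Normal(mu0, tau02) prior on each component mean, and a symmetric
    Dirichlet(alpha) prior on the mixing weights.  Means and weights
    are integrated out analytically, so the chain moves only on the
    assignment vector z -- the marginal posterior the abstract refers to.
    """
    rng = np.random.default_rng(rng)
    n = len(x)
    z = rng.integers(K, size=n)               # random initial assignments
    counts = np.bincount(z, minlength=K).astype(float)
    sums = np.array([x[z == k].sum() for k in range(K)])

    samples = []
    for _ in range(n_sweeps):
        for i in range(n):
            # remove point i from its current component
            k_old = z[i]
            counts[k_old] -= 1
            sums[k_old] -= x[i]

            # posterior predictive N(x_i | m_k, s2_k + sigma2) for each k,
            # with the component mean integrated out
            prec = 1.0 / tau02 + counts / sigma2
            m = (mu0 / tau02 + sums / sigma2) / prec
            s2 = 1.0 / prec
            log_pred = (-0.5 * np.log(2 * np.pi * (s2 + sigma2))
                        - 0.5 * (x[i] - m) ** 2 / (s2 + sigma2))

            # (counts + alpha) comes from integrating out the Dirichlet weights
            log_p = np.log(counts + alpha) + log_pred
            p = np.exp(log_p - log_p.max())
            k_new = rng.choice(K, p=p / p.sum())

            # reinsert point i with its new assignment
            z[i] = k_new
            counts[k_new] += 1
            sums[k_new] += x[i]
        samples.append(z.copy())
    return samples

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(-3, 1, 100), rng.normal(3, 1, 100)])
    samples = collapsed_gibbs_gmm(x, K=2, rng=1)
    print("last-sweep component sizes:", np.bincount(samples[-1], minlength=2))
```

Because parameters never appear in the state, every accept is on the assignment vector alone; the paper's contribution, per the abstract, is a faster way to propose such assignment moves than this one-point-at-a-time scan.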
