A Bayesian Bootstrap for Mixture Models

This paper proposes a new nonparametric Bayesian bootstrap for mixture models, obtained by extending the traditional Bayesian bootstrap. We first reinterpret the Bayesian bootstrap, which uses the Pólya urn scheme, as a gradient ascent algorithm with an associated one-step solver. The key idea is then to retain the same basic mechanism as the Bayesian bootstrap while switching from a point-mass kernel to a continuous kernel. Just as the Bayesian bootstrap works solely from the empirical distribution function, the new Bayesian bootstrap for mixture models works from the nonparametric maximum likelihood estimator of the mixing distribution. From a theoretical perspective, we prove the convergence and exchangeability of the sample sequences generated by the algorithm, and we illustrate our results on different models and settings as well as on some real data.
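As background for the extension described above, the traditional Bayesian bootstrap (Rubin, 1981) draws posterior samples of a functional by reweighting the observed data with Dirichlet(1, ..., 1) weights, i.e. it places random point masses on the empirical distribution. The sketch below illustrates only this classical scheme, not the paper's new mixture-model version; the data and functional (the mean) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=50)  # toy observed sample

def bayesian_bootstrap_mean(x, n_draws=1000, rng=rng):
    """Posterior draws of the mean under the classical Bayesian bootstrap."""
    n = len(x)
    # Dirichlet(1, ..., 1) weights over the n observations:
    # each row is one random reweighting of the empirical distribution.
    w = rng.dirichlet(np.ones(n), size=n_draws)
    # Weighted mean under each reweighting gives one posterior draw.
    return w @ x

draws = bayesian_bootstrap_mean(data)
print(draws.mean(), draws.std())
```

The draws concentrate around the sample mean, with spread reflecting posterior uncertainty; the paper's proposal replaces these point masses with a continuous kernel so that the same mechanism yields draws of a mixing distribution.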