A Bernstein-Von Mises Theorem for discrete probability distributions
We investigate the asymptotic normality of the posterior distribution in the discrete setting, when the model dimension increases with the sample size. We consider a probability mass function $\theta_0$ on $\mathbb{N}\setminus\{0\}$ and a sequence of truncation levels $(k_n)_n$ satisfying $k_n^3 \leq n \inf_{i\leq k_n} \theta_0(i)$. Let $\hat{\theta}_n$ denote the maximum likelihood estimate of $(\theta_0(i))_{i\leq k_n}$ and let $\Delta_n(\theta_0)$ denote the $k_n$-dimensional vector whose $i$-th coordinate is $\sqrt{n}\,(\hat{\theta}_n(i)-\theta_0(i))$ for $1\leq i\leq k_n$. We check that, under some mild conditions on $\theta_0$ and on the sequence of prior probabilities on the $k_n$-dimensional simplices, after centering and rescaling, the variation distance between the posterior distribution and the Gaussian distribution $\mathcal{N}(\Delta_n(\theta_0), I^{-1}(\theta_0))$ converges in probability to $0$. This theorem can be used to prove the asymptotic normality of some Bayesian estimators of the Shannon and R\'{e}nyi entropies. The proofs are based on concentration inequalities for centered and non-centered chi-square (Pearson) statistics. The latter make it possible to establish posterior concentration rates with respect to the Fisher distance rather than the Hellinger distance, as is commonplace in non-parametric Bayesian statistics.
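The Bernstein-von Mises phenomenon described above can be illustrated numerically. The following is a minimal sketch, not taken from the paper: it uses a multinomial model on a fixed finite support with a Dirichlet prior (a concrete instance of a prior on the simplex), and checks that the posterior, recentered at the MLE and rescaled by $\sqrt{n}$, has covariance close to $\mathrm{diag}(\theta_0) - \theta_0\theta_0^{\top}$, the matrix playing the role of $I^{-1}(\theta_0)$ here. The choices of `theta0`, `k_n`, `n`, and the flat Dirichlet prior are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative truncated pmf theta_0 on {1, ..., k_n} (assumed, not from the paper)
k_n = 5
theta0 = np.array([0.40, 0.25, 0.15, 0.12, 0.08])
n = 20_000

# Observe multinomial counts and form the maximum likelihood estimate
counts = rng.multinomial(n, theta0)
theta_hat = counts / n

# Posterior under a flat Dirichlet(1, ..., 1) prior on the k_n-simplex
post_samples = rng.dirichlet(counts + 1.0, size=50_000)

# Center at the MLE and rescale by sqrt(n), as in the theorem statement
rescaled = np.sqrt(n) * (post_samples - theta_hat)

# BvM predicts the rescaled posterior covariance approaches the asymptotic
# covariance diag(theta_0) - theta_0 theta_0^T (inverse Fisher information
# in this parametrization of the simplex)
target_cov = np.diag(theta0) - np.outer(theta0, theta0)
emp_cov = np.cov(rescaled, rowvar=False)

print(np.max(np.abs(emp_cov - target_cov)))   # small for large n
print(np.max(np.abs(rescaled.mean(axis=0))))  # recentered posterior mean near 0
```

The approximation error shrinks as $n$ grows relative to the model dimension, in line with the condition $k_n^3 \leq n \inf_{i\leq k_n}\theta_0(i)$ above.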