We investigate the asymptotic normality of the posterior distribution in the discrete setting, when the model dimension increases with the sample size. We consider a probability mass function $\theta_0$ on $\mathbb{N}\setminus\{0\}$ and a sequence of truncation levels $(k_n)_n$ satisfying $k_n^3 \leq n \inf_{i\leq k_n} \theta_0(i)$. Let $\hat{\theta}_n$ denote the maximum likelihood estimate of $(\theta_0(i))_{i\leq k_n}$ and let $\Delta_n(\theta_0)$ denote the $k_n$-dimensional vector whose $i$-th coordinate is defined by $\sqrt{n}(\hat{\theta}_n(i)-\theta_0(i))$ for $1\leq i\leq k_n$. We check that, under mild conditions on $\theta_0$ and on the sequence of prior probabilities on the $k_n$-dimensional probability simplices, after centering and rescaling, the variation distance between the posterior distribution recentered around $\hat{\theta}_n$ and rescaled by $\sqrt{n}$ and the $k_n$-dimensional Gaussian distribution $\mathcal{N}(\Delta_n(\theta_0), I^{-1}(\theta_0))$ converges in probability to $0$. This theorem can be used to prove the asymptotic normality of Bayesian estimators of Shannon and R\'{e}nyi entropies. The proofs are based on concentration inequalities for centered and non-centered chi-square (Pearson) statistics. The latter allow us to establish posterior concentration rates with respect to the Fisher distance rather than the Hellinger distance, as is commonplace in non-parametric Bayesian statistics.
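A minimal simulation sketch of the phenomenon, assuming a geometric-style $\theta_0$, a flat Dirichlet prior on the truncated simplex, and illustrative values of $n$ and $k_n$ (the pmf, prior, and all constants below are hypothetical choices, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: geometric-style pmf truncated to k_n + 1 categories.
n = 100_000                       # sample size
k_n = 5                           # truncation level; here k_n^3 <= n * min_i theta_0(i)
theta0 = 0.5 ** np.arange(1, k_n + 2)
theta0 = theta0 / theta0.sum()    # normalized pmf on {1, ..., k_n + 1}

# Multinomial counts and the MLE of the first k_n coordinates.
counts = rng.multinomial(n, theta0)
theta_hat = counts[:k_n] / n

# Flat Dirichlet prior => Dirichlet(1 + counts) posterior (conjugacy).
posterior = rng.dirichlet(1.0 + counts, size=20_000)

# Recenter around the MLE and rescale by sqrt(n), as in the theorem.
z = np.sqrt(n) * (posterior[:, :k_n] - theta_hat)

# Limiting covariance: the inverse Fisher information of the truncated
# multinomial, diag(theta_0) - theta_0 theta_0^T on the first k_n coordinates.
fisher_inv = np.diag(theta0[:k_n]) - np.outer(theta0[:k_n], theta0[:k_n])

print("empirical covariance of sqrt(n)(theta - theta_hat):")
print(np.cov(z, rowvar=False).round(4))
print("inverse Fisher information I^{-1}(theta_0):")
print(fisher_inv.round(4))
```

For moderate $k_n$ relative to $n$, the empirical covariance of the rescaled posterior draws should be close to $I^{-1}(\theta_0)$, which is the Gaussian limit the theorem describes.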