Nearly Minimax Discrete Distribution Estimation in Kullback-Leibler Divergence with High Probability
Main: 13 pages
Bibliography: 4 pages
Appendix: 21 pages
Abstract
We consider the problem of estimating a discrete distribution $p$ with support of size $K$ and provide both upper and lower bounds that hold with high probability in KL divergence. We prove that in the worst case, for any estimator $\widehat{p}$, with probability at least $\delta$, $\text{KL}(p \| \widehat{p}) \geq C\max\{K,\ln(K)\ln(1/\delta)\}/n$, where $n$ is the sample size and $C$ is a constant. We introduce a computationally efficient estimator $\widehat{p}$, based on online-to-batch conversion and suffix averaging, and show that with probability at least $1-\delta$ its KL error nearly matches this lower bound.
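The estimator is described only at a high level above: run an online (sequential) predictor over the sample and average a suffix of its per-round predictions. The sketch below illustrates that general recipe, not the paper's exact construction; the add-$\alpha$ smoothed online predictor (with $\alpha = 1/2$, Krichevsky-Trofimov style) and the choice of suffix starting at round $n/2$ are assumptions made here for concreteness.

```python
import numpy as np

def suffix_averaged_estimator(samples, K, alpha=0.5):
    """Online-to-batch conversion with suffix averaging (illustrative sketch).

    Runs an add-alpha online predictor over the sample sequence and returns
    the average of the predictors produced in the second half of the rounds.
    The paper's actual online predictor and suffix may differ; alpha = 0.5
    and suffix_start = n // 2 are assumptions.
    """
    n = len(samples)
    counts = np.zeros(K)
    suffix_start = n // 2        # average predictors from round n/2 onward
    p_hat = np.zeros(K)
    for t, x in enumerate(samples):
        if t >= suffix_start:
            # predictor formed from the first t observations only
            p_hat += (counts + alpha) / (t + alpha * K)
        counts[x] += 1
    return p_hat / (n - suffix_start)

# Usage: estimate a distribution on K = 5 symbols from n = 1000 draws.
rng = np.random.default_rng(0)
p = rng.dirichlet(np.ones(5))
samples = rng.choice(5, size=1000, p=p)
print(suffix_averaged_estimator(samples, K=5))
```

Each per-round predictor is a valid probability vector, so the suffix average is as well; averaging only the later rounds discards the noisy early predictors, which is the role suffix averaging plays in the online-to-batch conversion.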
