Minimax rates of entropy estimation on large alphabets via best polynomial approximation
Yihong Wu, Pengkun Yang

Abstract
Consider the problem of estimating the Shannon entropy of a distribution on $k$ elements from $n$ independent samples. We show that the minimax mean-square error is within universal multiplicative constant factors of $\Big(\frac{k}{n \log k}\Big)^2 + \frac{\log^2 k}{n}$ as long as $n$ grows no faster than a polynomial of $k$. This implies the recent result of Valiant-Valiant \cite{VV11} that the minimal sample size for consistent entropy estimation scales according to $\frac{k}{\log k}$. The apparatus of best polynomial approximation plays a key role in both the minimax lower bound and the construction of optimal estimators.
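To see how this risk bound recovers the $\frac{k}{\log k}$ sample complexity, one can check when the two error terms vanish; the following short derivation (not part of the original abstract) spells this out. The squared-bias term $\big(\frac{k}{n\log k}\big)^2$ tends to zero if and only if $n \gg \frac{k}{\log k}$, and the variance term $\frac{\log^2 k}{n}$ tends to zero if and only if $n \gg \log^2 k$; since $\frac{k}{\log k}$ dominates $\log^2 k$ for large $k$, the first condition is binding:
\[
  \Big(\frac{k}{n\log k}\Big)^2 + \frac{\log^2 k}{n} \;\to\; 0
  \quad\Longleftrightarrow\quad
  n \;\gg\; \frac{k}{\log k},
\]
which matches the Valiant-Valiant consistency threshold cited above.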
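The "best polynomial approximation" in the title refers to uniform approximation of the function $x \mapsto x\log\frac{1}{x}$, whose degree-$L$ best approximation error on $[0,1]$ is known to decay like $L^{-2}$. The sketch below (not from the paper; an illustration only) uses Chebyshev interpolation, a near-best proxy that is within a logarithmic factor of the true best approximation, to exhibit this rate numerically.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def xlog(x):
    """x * log(1/x) on [0, 1], extended by continuity so that f(0) = 0."""
    return np.where(x > 0, -x * np.log(np.where(x > 0, x, 1.0)), 0.0)

xs = np.linspace(0.0, 1.0, 20001)  # dense grid to estimate the sup-norm error
for deg in (4, 8, 16, 32, 64, 128):
    # Interpolate at Chebyshev nodes on [-1, 1], mapping t in [-1, 1] to x in [0, 1]
    coef = C.chebinterpolate(lambda t: xlog((t + 1.0) / 2.0), deg)
    err = np.max(np.abs(xlog(xs) - C.chebval(2.0 * xs - 1.0, coef)))
    # err * deg^2 should stay roughly bounded (up to a log factor from
    # using interpolation rather than true best approximation)
    print(f"degree {deg:4d}: sup error {err:.2e}, error * deg^2 = {err * deg**2:.3f}")
```

Roughly speaking, applying such an approximation at degree $L \asymp \log k$ on an interval of length of order $\frac{\log k}{n}$ gives a per-symbol bias of order $\frac{1}{n\log k}$, which after summing over $k$ symbols and squaring produces the $\big(\frac{k}{n\log k}\big)^2$ term in the minimax risk.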