Minimax rates of entropy estimation on large alphabets via best polynomial approximation
IEEE Transactions on Information Theory (IEEE Trans. Inf. Theory), 2016
Yihong Wu, Pengkun Yang
Abstract
Consider the problem of estimating the Shannon entropy of a distribution over $k$ elements from $n$ independent samples. We show that the minimax mean-square error is within universal multiplicative constant factors of $\left(\frac{k}{n \log k}\right)^2 + \frac{\log^2 k}{n}$ if $n$ exceeds a constant factor of $\frac{k}{\log k}$; otherwise there exists no consistent estimator. This refines the recent result of Valiant-Valiant \cite{VV11} that the minimal sample size for consistent entropy estimation scales according to $\Theta\left(\frac{k}{\log k}\right)$. The apparatus of best polynomial approximation plays a key role in both the construction of optimal estimators and, via a duality argument, the minimax lower bound.
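As a rough illustration of the two-regime construction the abstract alludes to (not the paper's exact estimator), here is a minimal Python sketch: symbols whose counts exceed roughly $\log k$ get a bias-corrected plug-in estimate of $-p \log p$, while small counts get an unbiased estimate of a near-best polynomial approximation of $-p \log p$ of degree on the order of $\log k$. The specific constants (1.5, 2.0, the threshold) are illustrative assumptions, and Chebyshev interpolation is used as a near-best stand-in for true best (Remez) polynomial approximation.

```python
import numpy as np
from numpy.polynomial import polynomial as P

def poly_entropy_estimator(counts, n, k):
    """Sketch of a two-regime entropy estimate (in nats).
    counts: length-k array of symbol counts (zeros included); n: sample size."""
    L = int(1.5 * np.log(k))           # polynomial degree ~ log k (illustrative constant)
    thr = np.log(k)                    # count threshold separating the two regimes
    a = 2.0 * np.log(k) / n            # "small probability" interval [0, a]

    # Near-best degree-L approximation of g(p) = -p log p on [0, a]:
    # interpolate at Chebyshev nodes, in the rescaled variable t = p/a.
    t = 0.5 * (1.0 + np.cos((2.0 * np.arange(L + 1) + 1.0) * np.pi / (2.0 * (L + 1))))
    g = lambda x: -x * np.log(x)       # nodes are strictly positive, so no log(0)
    b = P.polyfit(t, g(a * t), L)      # g(p) ~= sum_m b[m] * (p/a)^m on [0, a]

    H = 0.0
    for N in counts:
        if N > thr:
            # Large counts: plug-in with +1/(2n) offsetting the first-order bias.
            p = N / n
            H += -p * np.log(p) + 1.0 / (2.0 * n)
        else:
            # Small counts: unbiased estimate of the polynomial, using
            # E[(N)_m] = (n p)^m for N ~ Poisson(np), (N)_m the falling factorial.
            est, ff = b[0], 1.0
            for m in range(1, L + 1):
                ff *= N - (m - 1)
                est += b[m] * ff / (n * a) ** m
            H += est
    return H

# Hypothetical usage in the sublinear-sample regime n ~ k / log k:
rng = np.random.default_rng(0)
k, n = 10_000, 3_000
p = rng.dirichlet(np.ones(k))
counts = rng.multinomial(n, p)
print(poly_entropy_estimator(counts, n, k))   # compare with the truth -(p * np.log(p)).sum()
```

Note the design point this sketch reflects: unseen symbols (count 0) fall into the polynomial branch and contribute the constant term $b_0 \approx g(0) = 0$, which is how the estimator controls bias from the many low-probability symbols that dominate on large alphabets.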
