Minimax rates of entropy estimation on large alphabets via best
polynomial approximation
IEEE Transactions on Information Theory (IEEE Trans. Inf. Theory), 2014
Pengkun Yang
Abstract
Consider the problem of estimating the Shannon entropy of a distribution on $k$ elements from $n$ independent samples. We show that the minimax mean-square error is within universal multiplicative constant factors of $\left(\frac{k}{n\log k}\right)^2 + \frac{\log^2 k}{n}$ as long as $n$ grows no faster than a polynomial of $k$. This implies the recent result of Valiant-Valiant \cite{VV11} that the minimal sample size for consistent entropy estimation scales according to $\Theta\left(\frac{k}{\log k}\right)$. The apparatus of best polynomial approximation plays a key role in both the minimax lower bound and the construction of optimal estimators.
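As a rough numerical illustration of the approximation-theoretic ingredient, the sketch below (not from the paper) approximates the entropy kernel $\phi(x) = -x\log x$ on $[0,1]$ by Chebyshev interpolation, a standard near-best proxy for the true best uniform polynomial approximation, and reports how the sup-norm error shrinks as the degree grows. All function and variable names here are illustrative choices, not the authors' construction.

```python
import numpy as np

def phi(x):
    # Entropy kernel -x*log(x), extended by continuity with phi(0) = 0.
    # Summing phi(p_i) over a distribution's masses gives its Shannon entropy.
    return np.where(x > 0, -x * np.log(np.maximum(x, 1e-300)), 0.0)

# Dense grid on [0, 1] for measuring the uniform (sup-norm) approximation error.
grid = np.linspace(0.0, 1.0, 10001)

for deg in (2, 4, 8, 16):
    # Chebyshev interpolation on [0, 1]: within a log factor of the best
    # degree-deg polynomial approximation in sup norm.
    p = np.polynomial.chebyshev.Chebyshev.interpolate(phi, deg, domain=[0, 1])
    err = np.max(np.abs(phi(grid) - p(grid)))
    print(f"degree {deg:2d}: sup-norm error ~ {err:.4f}")
```

The error decays polynomially in the degree, which is the quantitative fact the estimator construction and lower bound both exploit.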
