Minimax Estimation of Functionals of Discrete Distributions

We propose a general framework for the construction and analysis of minimax estimators for a wide class of functionals of discrete distributions, where the alphabet size $S$ is unknown and may scale with the number of observations $n$. We treat the respective regions where the functional is "nonsmooth" and "smooth" separately. In the "nonsmooth" regime, we apply an unbiased estimator for the best polynomial approximation of the functional, whereas in the "smooth" regime, we apply a bias-corrected version of the Maximum Likelihood Estimator (MLE). We illustrate the merit of this approach by thoroughly analyzing the performance of the resulting schemes for estimating two important information measures: the entropy $H(P) = \sum_{i=1}^S -p_i \ln p_i$ and $F_\alpha(P) = \sum_{i=1}^S p_i^\alpha$, $\alpha > 0$. We obtain the minimax rates for estimating these functionals. In particular, we demonstrate that our estimator achieves the optimal sample complexity $n \asymp S/\ln S$ for entropy estimation. We also demonstrate that the sample complexity for estimating $F_\alpha(P)$, $0 < \alpha < 1$, is $n \asymp S^{1/\alpha}/\ln S$, which can be achieved by our estimator but not the MLE. For $1 < \alpha < 3/2$, we show that the minimax rate for estimating $F_\alpha(P)$ is $(n \ln n)^{-2(\alpha-1)}$ regardless of the alphabet size, while the exact rate for the MLE is $n^{-2(\alpha-1)}$. For all the above cases, the behavior of the optimal estimators with $n$ samples is essentially that of the MLE with $n \ln n$ samples. We highlight the practical advantages of our schemes for the estimation of entropy and mutual information. We compare our performance with that of the popular MLE and with the order-optimal entropy estimator of Valiant and Valiant. As we illustrate with a few experiments, our approach reduces running time and boosts accuracy.
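To make the two-regime recipe concrete, below is a minimal Python sketch of an entropy estimator in this spirit. The regime threshold of order $\ln n / n$, the polynomial degree of order $\ln n$, the constants `c1` and `c2`, and the use of a least-squares polynomial fit as a stand-in for the true best (minimax) polynomial approximation of $-p \ln p$ are all illustrative assumptions, not the paper's exact construction; only the overall structure (unbiased polynomial estimation below the threshold, bias-corrected plug-in above it) mirrors the approach described in the abstract.

```python
import numpy as np

def entropy_estimate(counts, c1=0.5, c2=1.0):
    """Two-regime entropy estimator in nats (illustrative sketch).

    counts -- observed symbol counts, summing to n.
    c1, c2 -- assumed tuning constants for the regime threshold and the
              polynomial degree; both quantities are of order ln n.
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    delta = c1 * np.log(n) / n          # boundary of the "nonsmooth" regime
    L = 2.0 * delta                     # approximation interval [0, L]
    K = max(1, int(c2 * np.log(n)))     # approximation degree ~ ln n

    # phi(p) = -p ln p, with the limit value 0 at p = 0.
    def phi(x):
        return np.where(x > 0, -x * np.log(np.maximum(x, 1e-300)), 0.0)

    # Least-squares polynomial fit of phi(L * t) on t in [0, 1], used here
    # as a proxy for the best polynomial approximation on [0, L]:
    # phi(p) ~ sum_k b[k] * (p / L)**k.
    t = np.linspace(0.0, 1.0, 8 * (K + 1))
    b = np.polynomial.polynomial.polyfit(t, phi(L * t), K)

    est = 0.0
    for x in counts:
        p_hat = x / n
        if p_hat < delta:
            # Nonsmooth regime: replace each power p**k in the approximating
            # polynomial by its unbiased estimate built from falling
            # factorials of the count x.
            term, fx, fn, Lk = b[0], 1.0, 1.0, 1.0
            for k in range(1, K + 1):
                fx *= x - (k - 1)
                fn *= n - (k - 1)
                Lk *= L
                term += b[k] * fx / (fn * Lk)
            est += term
        else:
            # Smooth regime: plug-in value plus a first-order bias
            # correction, since E[-p_hat ln p_hat] ~ -p ln p - (1 - p)/(2n).
            est += -p_hat * np.log(p_hat) + (1.0 - p_hat) / (2.0 * n)
    return est
```

The falling-factorial step is what makes the low-probability terms work: no unbiased estimator of $-p \ln p$ itself exists, but every monomial $p^k$ has one, since $\mathbb{E}[X(X-1)\cdots(X-k+1)] = n(n-1)\cdots(n-k+1)\,p^k$ for $X \sim \mathrm{Binomial}(n, p)$, so the best-approximating polynomial can be estimated without bias even where the functional is nonsmooth.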