Order-Optimal Estimation of Functionals of Discrete Distributions

Abstract

We propose a general framework for the construction and analysis of estimators for a wide class of functionals of discrete distributions, where the alphabet size $S$ is unknown and may scale with the number of observations $n$. We treat the regions where the functional is "nonsmooth" and "smooth" separately. In the "nonsmooth" regime, we apply an unbiased estimator for the best polynomial approximation of the functional, whereas in the "smooth" regime we apply a bias-corrected version of the Maximum Likelihood Estimator (MLE). We illustrate the merit of this approach by thoroughly analyzing the performance of the resulting schemes for estimating two important information measures: the entropy and the Rényi entropy of order $\alpha$. We obtain the best known upper bounds for the maximum mean squared error incurred in estimating these functionals. In particular, we demonstrate that our estimator achieves the optimal sample complexity $n = \Theta(S/\ln S)$ for entropy estimation. We also demonstrate that it suffices to have $n = \omega(S^{1/\alpha}/\ln S)$ for estimating the Rényi entropy of order $\alpha$, $0 < \alpha < 1$. Conversely, we establish a minimax lower bound showing that this sample complexity is optimal up to a $\sqrt{\ln S}$ factor. We highlight the practical advantages of our schemes for the estimation of entropy and mutual information. We compare our performance with that of the popular MLE and of the order-optimal entropy estimator of Valiant and Valiant. As we illustrate with a few experiments, our approach results in shorter running time and higher accuracy.
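For concreteness, the following is a minimal Python sketch of the two-regime construction described above, specialized to entropy estimation. It is an illustration under stated assumptions, not the paper's implementation: a least-squares Chebyshev fit stands in for the best (minimax) polynomial approximation, the constants `c1` and `c2` and the helper names (`poly_moment`, `entropy_estimate`) are hypothetical, and the paper's sample-splitting step is omitted for brevity.

```python
import numpy as np
from math import lgamma, log, ceil, exp
from numpy.polynomial import Chebyshev, Polynomial

def poly_moment(x, k, n):
    """Unbiased estimator of p**k from a count x ~ Binomial(n, p):
    the falling-factorial ratio x(x-1)...(x-k+1) / (n(n-1)...(n-k+1)).
    It equals 1 when k == 0 and 0 when x < k."""
    if k == 0:
        return 1.0
    if x < k:
        return 0.0
    return exp(lgamma(x + 1) - lgamma(x - k + 1)
               - lgamma(n + 1) + lgamma(n - k + 1))

def entropy_estimate(counts, c1=0.5, c2=1.0):
    """Two-regime entropy estimator (in nats) from symbol counts.

    c1, c2 are illustrative tuning constants (assumptions), not the
    values prescribed in the paper.
    """
    counts = np.asarray(counts)
    n = int(counts.sum())
    cut = c1 * log(n)                 # count threshold between regimes
    deg = int(ceil(c2 * log(n)))      # polynomial degree grows like ln n

    # Least-squares Chebyshev fit of -p*ln(p) on the "nonsmooth" region
    # [0, cut/n]; the paper uses the best (minimax) approximation instead.
    xs = np.linspace(1e-12, cut / n, 1024)
    cheb = Chebyshev.fit(xs, -xs * np.log(xs), deg=deg)
    a = cheb.convert(kind=Polynomial).coef  # monomial coefficients a_k

    est = 0.0
    for x in counts[counts > 0]:      # unseen symbols contribute nothing here
        x = int(x)
        if x > cut:
            # "Smooth" regime: MLE with a first-order bias correction.
            p = x / n
            est += -p * log(p) + 1.0 / (2 * n)
        else:
            # "Nonsmooth" regime: unbiased estimate of sum_k a_k * p^k
            # via falling-factorial moments of the count.
            est += sum(a[k] * poly_moment(x, k, n) for k in range(len(a)))
    return est

# Example: estimate the entropy of a uniform distribution on 1000 symbols
# from 5000 samples; the true value is ln(1000) ~ 6.91 nats.
rng = np.random.default_rng(0)
samples = rng.integers(0, 1000, size=5000)
print(entropy_estimate(np.bincount(samples)))
```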
