Minimax Estimation of Functionals of Discrete Distributions

Abstract

We propose a general framework for the construction and analysis of minimax estimators for a wide class of functionals of discrete distributions, where the alphabet size $S$ is unknown and may scale with the number of observations $n$. We treat separately the regions where the functional is "nonsmooth" and "smooth". In the "nonsmooth" regime, we apply an unbiased estimator for the best polynomial approximation of the functional, whereas in the "smooth" regime, we apply a bias-corrected version of the Maximum Likelihood Estimator (MLE). We illustrate the merit of this approach by thoroughly analyzing the performance of the resulting schemes for estimating two important information measures: the entropy $H(P) = \sum_{i=1}^S -p_i \ln p_i$ and $F_\alpha(P) = \sum_{i=1}^S p_i^\alpha$, $\alpha > 0$. We obtain the minimax $L_2$ rates for estimating these functionals. In particular, we demonstrate that our estimator achieves the optimal sample complexity $n = \Theta(S/\ln S)$ for entropy estimation. We also demonstrate that the sample complexity for estimating $F_\alpha(P)$, $0 < \alpha < 1$, is $\Theta(S^{1/\alpha}/\ln S)$, which can be achieved by our estimator but not by the MLE. For $1 < \alpha < 3/2$, we show that the minimax $L_2$ rate for estimating $F_\alpha(P)$ is $(n \ln n)^{-2(\alpha-1)}$ regardless of the alphabet size, while the exact $L_2$ rate for the MLE is $n^{-2(\alpha-1)}$. In all the above cases, the behavior of the optimal estimators with $n$ samples is essentially that of the MLE with $n \ln n$ samples. We highlight the practical advantages of our schemes for the estimation of entropy and mutual information. We compare our performance with that of the popular MLE and with the order-optimal entropy estimator of Valiant and Valiant. As we illustrate with a few experiments, our approach reduces running time and boosts accuracy.
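To make the "smooth"-regime baseline concrete, the sketch below shows a plug-in (MLE) entropy estimator together with the classical Miller–Madow first-order bias correction, which adds $(\hat{S}-1)/(2n)$ where $\hat{S}$ is the number of observed symbols. This is a standard illustration of a bias-corrected MLE, not the paper's own estimator (which uses a more refined correction and handles the "nonsmooth" regime via polynomial approximation); the function names are ours.

```python
import math
from collections import Counter

def entropy_mle(samples):
    """Plug-in (MLE) estimate of H(P) = -sum_i p_i ln p_i,
    using empirical frequencies p_hat_i = count_i / n."""
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def entropy_miller_madow(samples):
    """MLE plus the Miller-Madow bias correction (S_hat - 1) / (2n),
    where S_hat is the number of distinct observed symbols."""
    n = len(samples)
    s_hat = len(set(samples))
    return entropy_mle(samples) + (s_hat - 1) / (2 * n)

# Example: 1000 samples split evenly over two symbols.
# The MLE returns exactly ln 2 here, since p_hat = (1/2, 1/2).
print(entropy_mle([0, 1] * 500))           # ln 2 ≈ 0.6931
print(entropy_miller_madow([0, 1] * 500))  # ln 2 + 1/2000
```

The correction matters most when the alphabet size is comparable to $n$, which is precisely the regime where the plug-in estimator's bias (rather than its variance) dominates the $L_2$ risk.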
