
arXiv:1407.0381 (v3, latest)

Minimax rates of entropy estimation on large alphabets via best polynomial approximation

1 July 2014
Yihong Wu
Pengkun Yang
Abstract

Consider the problem of estimating the Shannon entropy of a distribution over $k$ elements from $n$ independent samples. We show that the minimax mean-square error is within universal multiplicative constant factors of $\Big(\frac{k}{n \log k}\Big)^2 + \frac{\log^2 k}{n}$ if $n$ exceeds a constant factor of $\frac{k}{\log k}$; otherwise there exists no consistent estimator. This refines the recent result of Valiant-Valiant \cite{VV11} that the minimal sample size for consistent entropy estimation scales as $\Theta(\frac{k}{\log k})$. The apparatus of best polynomial approximation plays a key role in both the construction of optimal estimators and, via a duality argument, the minimax lower bound.
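To make the role of best polynomial approximation concrete, here is a minimal illustrative sketch (not the authors' code) of an estimator in this spirit: symbols with small counts are handled by an unbiased estimate of a near-best polynomial approximation of $\phi(x) = -x\log x$ on a small interval, while symbols with large counts use a bias-corrected plug-in. The function name, the constants `c0`, `c1`, `c2`, and the Chebyshev-interpolation fit are assumptions of this sketch, not details taken from the paper.

```python
import numpy as np

def entropy_poly_sketch(counts, n, k, c0=1.0, c1=2.0, c2=1.0):
    """Sketch of entropy estimation via polynomial approximation,
    in the spirit of Wu-Yang (2014). Returns an estimate in nats.

    counts : length-k array of symbol counts (zeros for unseen symbols)
    n      : sample size (sum of counts)
    k      : alphabet size
    c0,c1,c2 : illustrative tuning constants (the paper chooses these
               carefully to achieve the minimax rate)
    """
    counts = np.asarray(counts)
    L = max(1, int(c0 * np.log(k)))    # polynomial degree, on the order of log k
    delta = c1 * np.log(k) / n         # "small-probability" interval [0, delta]
    thresh = c2 * np.log(k)            # counts above this use the plug-in branch

    # Near-best polynomial approximation of phi(x) = -x log x on [0, delta],
    # fitted by interpolation at Chebyshev nodes (within a log factor of the
    # true best approximation -- good enough for a sketch).
    theta = (2 * np.arange(L + 1) + 1) * np.pi / (2 * (L + 1))
    nodes = delta / 2 * (1 + np.cos(theta))              # all strictly positive
    vals = -nodes * np.log(nodes)
    coef = np.polynomial.polynomial.polyfit(nodes, vals, L)  # coef[m] * x**m

    est = 0.0
    for N in counts:
        if N > thresh:
            # Large count: plug-in with a first-order 1/(2n) bias correction.
            p = N / n
            est += -p * np.log(p) + 1.0 / (2 * n)
        else:
            # Small count: unbiased estimate of the polynomial. Under a
            # Poisson(n*p) model, E[N(N-1)...(N-m+1)] = (n*p)^m, so the
            # falling factorial divided by n^m estimates p^m without bias.
            # For N = 0 this reduces to coef[0], covering unseen symbols.
            ff, g = 1.0, coef[0]
            for m in range(1, L + 1):
                ff *= N - m + 1          # becomes (and stays) 0 once m > N
                g += coef[m] * ff / n ** m
            est += g
    return est

# Illustrative usage: uniform distribution, undersampled at n = k / 2.
rng = np.random.default_rng(0)
k, n = 10_000, 5_000
counts = np.bincount(rng.integers(k, size=n), minlength=k)
print(entropy_poly_sketch(counts, n, k), "vs true entropy", np.log(k))
```

With untuned constants the small-count branch can be noisy, since the coefficients of the falling-factorial estimators grow quickly with the degree; the paper picks the degree and thresholds as specific multiples of $\log k$ so that bias and variance balance at the minimax rate stated above.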
