

Estimating Renyi Entropy of Discrete Distributions

2 August 2014
Jayadev Acharya
A. Orlitsky
A. Suresh
Himanshu Tyagi
Abstract

It was recently shown that estimating the Shannon entropy $H(p)$ of a discrete $k$-symbol distribution $p$ requires $\Theta(k/\log k)$ samples, a number that grows near-linearly in the support size. In many applications $H(p)$ can be replaced by the more general Rényi entropy of order $\alpha$, $H_\alpha(p)$. We determine the number of samples needed to estimate $H_\alpha(p)$ for all $\alpha$, showing that $\alpha < 1$ requires a super-linear, roughly $k^{1/\alpha}$ samples, noninteger $\alpha > 1$ requires a near-linear $k$ samples, but, perhaps surprisingly, integer $\alpha > 1$ requires only $\Theta(k^{1-1/\alpha})$ samples. Furthermore, developing on a recently established connection between polynomial approximation and estimation of additive functions of the form $\sum_x f(p_x)$, we reduce the sample complexity for noninteger values of $\alpha$ by a factor of $\log k$ compared to the empirical estimator. The estimators achieving these bounds are simple and run in time linear in the number of samples. Our lower bounds provide explicit constructions of distributions with different Rényi entropies that are hard to distinguish.
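To make the two estimation regimes concrete, here is a minimal sketch (not code from the paper; function names are illustrative) of the empirical plug-in estimator of $H_\alpha(p)$ alongside the standard bias-corrected power-sum estimator for integer $\alpha \ge 2$, which replaces the empirical power sum $\sum_x \hat{p}_x^\alpha$ with the unbiased falling-factorial statistic — the device behind the $\Theta(k^{1-1/\alpha})$ regime the abstract describes:

```python
import math
from collections import Counter

def renyi_plugin(samples, alpha):
    """Plug-in estimate: apply the Renyi formula to empirical frequencies."""
    n = len(samples)
    freqs = [c / n for c in Counter(samples).values()]
    if alpha == 1:  # Shannon entropy is the alpha -> 1 limit
        return -sum(p * math.log(p) for p in freqs)
    power_sum = sum(p ** alpha for p in freqs)
    return math.log(power_sum) / (1 - alpha)

def renyi_integer_alpha(samples, alpha):
    """Bias-corrected estimate for integer alpha >= 2.

    The statistic sum_x N_x (N_x - 1) ... (N_x - alpha + 1) divided by
    n (n - 1) ... (n - alpha + 1) is an unbiased estimator of the power
    sum sum_x p_x^alpha; it needs at least one symbol with >= alpha
    occurrences to be finite.
    """
    n = len(samples)
    counts = Counter(samples).values()
    num = sum(math.perm(c, alpha) for c in counts)  # falling factorials (0 if c < alpha)
    den = math.perm(n, alpha)
    return math.log(num / den) / (1 - alpha)
```

For example, on the sample `[0, 0, 1, 1, 2, 2, 3, 3]` with $\alpha = 2$, the plug-in estimator returns $\log 4$ (the collision entropy of the empirical uniform distribution), while the bias-corrected version returns $-\log(8/56) = \log 7$.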
