Estimation of Entropy in Constant Space with Improved Sample Complexity

Abstract
Recent work of Acharya et al. (NeurIPS 2019) showed how to estimate the entropy of a distribution $\mathcal{D}$ over an alphabet of size $k$ up to $\pm\epsilon$ additive error by streaming over $(k/\epsilon^3) \cdot \mathrm{polylog}(1/\epsilon)$ i.i.d. samples and using only $O(1)$ words of memory. In this work, we give a new constant memory scheme that reduces the sample complexity to $(k/\epsilon^2) \cdot \mathrm{polylog}(1/\epsilon)$. We conjecture that this is optimal up to $\mathrm{polylog}(1/\epsilon)$ factors.
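The abstract does not describe the algorithm itself, but the constant-memory template behind schemes of this kind can be sketched: draw a symbol $x \sim \mathcal{D}$ from the stream, estimate $p_x$ from a fresh window of samples, and keep a running average of estimates of $\log(1/p_x)$, since $H(\mathcal{D}) = \mathbb{E}_{x \sim \mathcal{D}}[\log(1/p_x)]$. The sketch below is a minimal, hypothetical illustration of that template only; it is not the scheme of this paper or of Acharya et al., whose algorithms add bias correction to achieve the stated sample complexities. All function names and parameters here are assumptions for illustration.

```python
import math
import random

def streaming_entropy_estimate(stream, num_trials, window):
    """Hypothetical sketch: average plug-in estimates of log(1/p_x)
    over independent trials, using O(1) words of memory."""
    total = 0.0
    for _ in range(num_trials):
        x = next(stream)                       # draw one sample x ~ D
        hits = sum(1 for _ in range(window)    # count x in a fresh window
                   if next(stream) == x)
        p_hat = max(hits, 1) / window          # crude estimate of p_x
        total += math.log(1.0 / p_hat)         # plug-in estimate of log(1/p_x)
    return total / num_trials                  # running sum: one word of state

# Toy check: the uniform distribution over k symbols has entropy ln(k).
k = 100
stream = iter(lambda: random.randrange(k), None)  # endless i.i.d. samples
print(streaming_entropy_estimate(stream, num_trials=500, window=5000))
print(math.log(k))  # ground truth for comparison
```

Note that the naive plug-in estimate $\log(1/\hat{p}_x)$ is biased, which is why it needs a window of roughly $1/(\epsilon p_x)$ samples per trial; the cited works replace this step with more careful estimators to reach the bounds above.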