Estimation of Rényi Entropy and Mutual Information Based on Generalized Nearest-Neighbor Graphs

Abstract
In this paper we consider simple and computationally efficient nonparametric estimators of Rényi entropy and mutual information based on an i.i.d. sample drawn from an unknown, absolutely continuous distribution over $\mathbb{R}^d$. Following previous works, the estimators are computed as the sum of the $p$-th powers of the Euclidean lengths of the edges of the "generalized nearest-neighbor" graph of the sample and of the empirical copula of the sample, respectively. Under mild conditions we prove the almost sure consistency of the estimators. In addition, we derive high-probability error bounds assuming that the density underlying the sample is Lipschitz continuous.
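As a concrete illustration (not taken from the paper), the sketch below implements the entropy estimator in the special case where the generalized nearest-neighbor graph joins each point to all of its $k$ nearest neighbors. The function names, the use of scipy.spatial.cKDTree, and the Monte Carlo calibration of the unknown normalizing constant $\gamma$ on uniform samples are assumptions made here for illustration; the paper's theory covers more general edge sets and does not prescribe this calibration.

```python
# Minimal sketch of a kNN-graph estimator of the Renyi entropy of order
# alpha = 1 - p/d, for 0 < p < d.  Hypothetical helper names; the Monte
# Carlo estimate of gamma below is an illustrative assumption, chosen so
# that the estimator is exact for the uniform density on [0, 1]^d.
import numpy as np
from scipy.spatial import cKDTree

def knn_edge_power_sum(x, k=1, p=1.0):
    """Sum of p-th powers of the Euclidean edge lengths of the k-NN graph."""
    tree = cKDTree(x)
    # Query k+1 neighbors: the nearest "neighbor" of each point is itself
    # (distance 0), so we drop the first column.
    dists, _ = tree.query(x, k=k + 1)
    return np.sum(dists[:, 1:] ** p)

def renyi_entropy_estimate(x, k=1, p=1.0, n_mc=50, rng=None):
    """Estimate the Renyi entropy of order alpha = 1 - p/d from sample x."""
    rng = np.random.default_rng(rng)
    n, d = x.shape
    alpha = 1.0 - p / d
    # L_p / n^{1-p/d} converges to gamma * integral of f^alpha; for the
    # uniform density that integral is 1, which calibrates gamma.
    gamma = np.mean([
        knn_edge_power_sum(rng.random((n, d)), k, p) / n ** (1 - p / d)
        for _ in range(n_mc)
    ])
    L = knn_edge_power_sum(x, k, p)
    return np.log(L / (gamma * n ** (1 - p / d))) / (1 - alpha)
```

For example, `renyi_entropy_estimate(rng.random((2000, 2)), k=1, p=1.0)` should return a value near 0, the Rényi entropy of the uniform distribution on the unit square.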