Efficient multivariate entropy estimation via $k$-nearest neighbour distances

Abstract

Many statistical procedures, including goodness-of-fit tests and methods for independent component analysis, rely critically on the estimation of the entropy of a distribution. In this paper, we seek entropy estimators that are efficient in the sense of achieving the local asymptotic minimax lower bound. To this end, we initially study a generalisation of the estimator originally proposed by \citet{Kozachenko:87}, based on the $k$-nearest neighbour distances of a sample of $n$ independent and identically distributed random vectors in $\mathbb{R}^d$. When $d \leq 3$ and provided $k/\log^5 n \rightarrow \infty$ (as well as other regularity conditions), we show that the estimator is efficient; on the other hand, when $d \geq 4$, a non-trivial bias precludes its efficiency regardless of the choice of $k$. This motivates us to consider a new entropy estimator, formed as a weighted average of Kozachenko--Leonenko estimators for different values of $k$. A careful choice of weights enables us to obtain an efficient estimator in arbitrary dimensions, given sufficient smoothness. In addition to the new estimator proposed and theoretical understanding provided, our results also have other methodological implications; in particular, they motivate the prewhitening of the data before applying the estimator and facilitate the construction of asymptotically valid confidence intervals of asymptotically minimal width.
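For readers unfamiliar with the base estimator, the following is a minimal sketch of the standard Kozachenko--Leonenko $k$-nearest neighbour entropy estimate, $\hat H = \psi(n) - \psi(k) + \log V_d + \frac{d}{n}\sum_{i=1}^n \log \rho_{k,i}$, where $\rho_{k,i}$ is the distance from $X_i$ to its $k$-th nearest neighbour and $V_d$ is the volume of the unit ball in $\mathbb{R}^d$. This is an illustration of the classical estimator only, not the authors' code; the weighted combination over several $k$ proposed in the paper is omitted.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln


def kl_entropy(x, k=1):
    """Kozachenko-Leonenko k-NN entropy estimate, in nats.

    x : array of shape (n, d), i.i.d. sample.
    k : which nearest neighbour to use.
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # query returns each point itself at distance 0, so ask for k+1
    # neighbours and take the last column: the k-th neighbour distance.
    rho = tree.query(x, k=k + 1)[0][:, -1]
    # log volume of the unit d-ball, computed via gammaln for stability
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(rho))
```

For a univariate standard normal sample the estimate should be close to the true entropy $\tfrac{1}{2}\log(2\pi e) \approx 1.419$ nats for moderate $n$.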
