Influence Functions for Machine Learning: Nonparametric Estimators for Entropies, Divergences and Mutual Informations

Abstract

We propose and analyze estimators for statistical functionals of one or more distributions under nonparametric assumptions. Our estimators are based on the theory of influence functions, which appear in the semiparametric statistics literature. In our analysis we upper bound the rate of convergence for these estimators, showing that they achieve the parametric rate when the densities are sufficiently smooth. We also establish asymptotic normality in this smooth regime under certain regularity conditions. We apply this framework to derive estimators for several popular information theoretic quantities.
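To make the idea concrete, here is a minimal sketch of a data-split, influence-function-corrected ("one-step") estimator for differential entropy, H(p) = -E[log p(X)]. This is an illustrative construction, not the paper's exact estimator: for entropy the influence function is IF(x; p) = -log p(x) - H(p), so the plug-in value plus the empirical correction collapses to a held-out cross-entropy under a kernel density estimate. The function name `entropy_onestep` and the Gaussian KDE choice are assumptions for the sketch.

```python
import numpy as np
from scipy.stats import gaussian_kde

def entropy_onestep(x, seed=0):
    """One-step estimator of differential entropy via sample splitting.

    Sketch only: fit a kernel density estimate p_hat on one half of the
    sample, then apply the first-order influence-function correction
    using the other half.  Because IF(x; p) = -log p(x) - H(p), the
    corrected estimate T(p_hat) + mean(IF) reduces to the held-out
    cross-entropy -mean(log p_hat(x)).
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    idx = rng.permutation(len(x))
    half = len(x) // 2
    fit_half, eval_half = x[idx[:half]], x[idx[half:]]
    p_hat = gaussian_kde(fit_half)          # nonparametric density estimate
    dens = np.clip(p_hat(eval_half), 1e-12, None)  # guard against log(0)
    return -np.mean(np.log(dens))
```

For sufficiently smooth densities, estimators of this one-step form attain the parametric n^{-1/2} rate, which is the regime the abstract refers to; on a standard normal sample the output should approach 0.5·log(2πe) ≈ 1.419.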
