Influence Functions for Machine Learning: Nonparametric Estimators for Entropies, Divergences and Mutual Informations

Abstract
We propose and analyze estimators for statistical functionals of one or more distributions under nonparametric assumptions. Our estimators are based on the theory of influence functions, which appears in the semiparametric statistics literature. Theoretically, we upper bound the rate of convergence for these estimators, showing that they achieve a parametric rate when the densities are sufficiently smooth. We also establish asymptotic normality in this smooth regime under certain regularity conditions. We apply this framework to derive estimators for entropies, divergences, and mutual informations, as well as their conditional versions.
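To make the idea concrete, here is a minimal, hedged sketch of a first-order influence-function ("one-step") estimator for differential entropy, not the paper's exact construction. For the entropy functional H(p) = -E[log p(X)], the influence function at p is IF(x; p) = -log p(x) - H(p), so adding the empirical mean of IF on held-out data to the plug-in value collapses to a held-out cross-entropy under a pilot density estimate. The function name and the sample-splitting scheme below are illustrative assumptions.

```python
# Illustrative sketch (assumed setup, not the paper's estimator): a one-step
# influence-function correction for differential entropy with sample splitting.
import numpy as np
from scipy.stats import gaussian_kde

def entropy_one_step(x):
    """One-step entropy estimate with sample splitting.

    The first half of the data fits a kernel density estimate p_hat
    (the pilot); the second half evaluates -mean(log p_hat), i.e. the
    plug-in value H(p_hat) plus the empirical mean of the influence
    function IF(x; p_hat) = -log p_hat(x) - H(p_hat).
    """
    x = np.asarray(x, dtype=float)
    n = x.shape[0] // 2
    p_hat = gaussian_kde(x[:n])             # pilot density from first half
    return -np.mean(np.log(p_hat(x[n:])))   # held-out cross-entropy

rng = np.random.default_rng(0)
sample = rng.normal(size=4000)
h_hat = entropy_one_step(sample)
h_true = 0.5 * np.log(2 * np.pi * np.e)     # entropy of N(0, 1)
print(h_hat, h_true)
```

Sample splitting keeps the pilot estimate independent of the data used for the correction term, which is one standard way such estimators attain parametric rates for smooth densities.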