Influence Functions for Machine Learning: Nonparametric Estimators for Entropies, Divergences and Mutual Informations

17 November 2014
Kirthevasan Kandasamy, A. Krishnamurthy, Barnabás Póczós, Larry A. Wasserman, J. M. Robins
Abstract

We propose and analyze estimators for statistical functionals of one or more distributions under nonparametric assumptions. Our estimators are based on the theory of influence functions, which appear in the semiparametric statistics literature. In our analysis we upper bound the rate of convergence for these estimators, showing that they achieve the parametric rate when the densities are sufficiently smooth. We also establish asymptotic normality in this smooth regime under certain regularity conditions. We apply this framework to derive estimators for several popular information theoretic quantities.
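To make the influence-function idea concrete, the sketch below shows a first-order corrected estimator for Shannon entropy of a one-dimensional density: a pilot density is fit on one half of the sample (a Gaussian KDE here), and the influence-function correction is averaged over the held-out half. This is a minimal illustration of the general framework, not the paper's estimator or analysis; the function name, the KDE choice, and the sample sizes are placeholders chosen for the example.

```python
import numpy as np
from scipy.stats import gaussian_kde


def entropy_first_order(samples, seed=0):
    """First-order influence-function estimator of Shannon entropy (1-D sketch).

    Split the sample: fit a pilot density p_hat on one half, then add the
    average of the influence function psi(x; p) = -log p(x) - H(p),
    evaluated at p_hat, over the other half.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(samples, dtype=float)
    idx = rng.permutation(len(x))
    half = len(x) // 2
    fit, held_out = x[idx[:half]], x[idx[half:]]

    # Pilot density estimate from the first half.
    p_hat = gaussian_kde(fit)

    # Plug-in entropy H(p_hat), approximated by Monte Carlo under p_hat.
    mc = p_hat.resample(20_000).ravel()
    plug_in = -np.mean(np.log(p_hat(mc)))

    # Influence-function correction averaged over the held-out half.
    correction = np.mean(-np.log(p_hat(held_out)) - plug_in)

    # First-order estimator: T(p_hat) + average correction.
    # For entropy this algebraically reduces to the held-out
    # average of -log p_hat(X_i), since the constant terms cancel.
    return plug_in + correction


# Example: standard normal data; true entropy = 0.5 * log(2 * pi * e) ≈ 1.4189.
rng = np.random.default_rng(1)
print(entropy_first_order(rng.normal(size=5000)))
```

The paper's contribution lies in the analysis of such corrected estimators (convergence rates reaching the parametric rate for sufficiently smooth densities, and asymptotic normality), which this mechanical sketch does not reproduce.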
