(f,Γ)-Divergences: Interpolating between f-Divergences and Integral Probability Metrics

11 November 2020
Jeremiah Birrell
P. Dupuis
Markos A. Katsoulakis
Yannis Pantazis
Luc Rey-Bellet
arXiv:2011.05953
Abstract

We develop a general framework for constructing new information-theoretic divergences that rigorously interpolate between f-divergences and integral probability metrics (IPMs), such as the Wasserstein distance. These new divergences inherit features from IPMs, such as the ability to compare distributions which are not absolutely continuous, as well as from f-divergences, for instance the strict concavity of their variational representations and the ability to compare heavy-tailed distributions. When combined, these features establish a divergence with improved convergence and estimation properties for statistical learning applications. We demonstrate their use in the training of generative adversarial networks (GANs) for heavy-tailed data and also show that they can provide improved performance over gradient-penalized Wasserstein GANs in image generation.
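In the notation reconstructed here from the arXiv version (D_f for the f-divergence, W^Γ for the IPM generated by a function class Γ, and f^* for the Legendre transform of f; all three symbols are assumptions of this sketch rather than quotes from the abstract), the object of study admits a variational representation of the form

\[ D_f^{\Gamma}(P\|Q) \;=\; \sup_{g\in\Gamma}\Big\{\mathbb{E}_P[g] \;-\; \inf_{\nu\in\mathbb{R}}\big\{\nu + \mathbb{E}_Q[f^{*}(g-\nu)]\big\}\Big\}, \]

together with an infimal-convolution formula that makes the interpolation explicit:

\[ D_f^{\Gamma}(P\|Q) \;=\; \inf_{\eta}\big\{D_f(\eta\|Q) + W^{\Gamma}(P,\eta)\big\}. \]

Choosing Γ to be, for example, the 1-Lipschitz functions gives Wasserstein-like IPM behavior, while a sufficiently rich Γ recovers the f-divergence D_f itself.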

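As a concrete illustration, consider the KL case f(x) = x log x, where the inner infimum over the shift ν can be solved in closed form and the objective reduces to a Donsker-Varadhan-type quantity E_P[g] - log E_Q[e^g] over a Lipschitz-constrained critic g. The following is a minimal NumPy sketch of the resulting sample estimator, with a hand-picked 1-Lipschitz critic standing in for a trained discriminator; it is an assumption-laden sketch, not the authors' code.

import numpy as np

def kl_gamma_objective(g, x_p, x_q):
    # Sample estimate of E_P[g] - log E_Q[exp(g)]: the (KL, Gamma)
    # variational objective when the critic g is restricted to Gamma
    # (e.g. 1-Lipschitz functions). Sketch only, not the paper's code.
    gp = g(x_p)   # critic evaluated on samples from P
    gq = g(x_q)   # critic evaluated on samples from Q
    m = gq.max()  # max shift for a numerically stable log-mean-exp
    return gp.mean() - (m + np.log(np.mean(np.exp(gq - m))))

rng = np.random.default_rng(0)
x_p = rng.standard_t(df=3, size=10_000)  # heavy-tailed "data" samples
x_q = rng.standard_normal(10_000)        # light-tailed "model" samples
print(kl_gamma_objective(np.abs, x_p, x_q))  # np.abs is 1-Lipschitz

In a GAN setting the fixed critic np.abs would be replaced by a neural network maximizing this objective under a Lipschitz constraint (for instance via a gradient penalty), which is the setup the abstract compares against gradient-penalized Wasserstein GANs.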