Function-space regularized Rényi divergences

10 October 2022
Jeremiah Birrell, Yannis Pantazis, Paul Dupuis, Markos A. Katsoulakis, Luc Rey-Bellet
arXiv:2210.04974
Abstract

We propose a new family of regularized Rényi divergences parametrized not only by the order α but also by a variational function space. These new objects are defined by taking the infimal convolution of the standard Rényi divergence with the integral probability metric (IPM) associated with the chosen function space. We derive a novel dual variational representation that can be used to construct numerically tractable divergence estimators. This representation avoids risk-sensitive terms and therefore exhibits lower variance, making it well-behaved when α > 1; this addresses a notable weakness of prior approaches. We prove several properties of these new divergences, showing that they interpolate between the classical Rényi divergences and IPMs. We also study the α → ∞ limit, which leads to a regularized worst-case regret and a new variational representation in the classical case. Moreover, we show that the proposed regularized Rényi divergences inherit features from IPMs, such as the ability to compare distributions that are not absolutely continuous, e.g., empirical measures and distributions with low-dimensional support. We present numerical results on both synthetic and real datasets, showing the utility of these new divergences in both estimation and GAN training applications; in particular, we demonstrate significantly reduced variance and improved training performance.
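
For orientation, the infimal-convolution construction described in the abstract can be written schematically as follows. The notation here (Γ for the chosen function space, W^Γ for its IPM, η for the auxiliary measure, D_α^Γ for the regularized divergence) is ours and not necessarily the paper's, and the argument order in the convolution is an assumption:

D_\alpha^\Gamma(P \| Q) \;:=\; \inf_{\eta} \big\{ D_\alpha(\eta \| Q) + W^\Gamma(P, \eta) \big\},
\qquad
W^\Gamma(P, \eta) \;:=\; \sup_{g \in \Gamma} \big\{ \mathbb{E}_P[g] - \mathbb{E}_\eta[g] \big\}.

This makes the interpolation claim plausible at a glance: when the IPM penalty is strong, the infimum pushes η toward P and the classical D_α(P‖Q) is recovered; when it is weak, the IPM term dominates the value. The α → ∞ endpoint mentioned in the abstract is the worst-case regret, classically D_\infty(P \| Q) = \log \operatorname{ess\,sup} \frac{dP}{dQ}, which is regularized in the same fashion.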
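The abstract's point about risk-sensitive terms can be made concrete with a small, hypothetical PyTorch sketch of a classical (unregularized) neural Rényi estimator of the kind the paper improves upon. The variational formula in the comments is the standard risk-sensitive representation written from memory, so treat its exact constants as an assumption; all names (log_mean_exp, renyi_objective, the network size, the Gaussian test pair) are our own illustration, not the paper's code. The two log-mean-exp terms are the risk-sensitive quantities whose Monte Carlo variance degrades for α > 1.

import math
import torch
import torch.nn as nn

def log_mean_exp(x):
    # Numerically stable log(mean(exp(x))) along dim 0.
    return torch.logsumexp(x, dim=0) - math.log(x.shape[0])

def renyi_objective(g, x_q, x_p, alpha):
    # Risk-sensitive variational objective (constants are an assumption):
    #   (alpha/(alpha-1)) * log E_Q[exp((alpha-1) g)]  -  log E_P[exp(alpha g)]
    # Both terms are cumulant generating functions; for alpha > 1 the
    # exp(alpha * g) term makes sample estimates heavy-tailed, which is the
    # variance problem the paper's representation avoids.
    g_q = g(x_q).squeeze(-1)
    g_p = g(x_p).squeeze(-1)
    term_q = (alpha / (alpha - 1.0)) * log_mean_exp((alpha - 1.0) * g_q)
    term_p = log_mean_exp(alpha * g_p)
    return term_q - term_p

torch.manual_seed(0)
g = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(g.parameters(), lr=1e-3)
alpha = 2.0

for step in range(3000):
    x_q = torch.randn(512, 1) + 1.0  # samples from Q = N(1, 1)
    x_p = torch.randn(512, 1)        # samples from P = N(0, 1)
    loss = -renyi_objective(g, x_q, x_p, alpha)  # maximize the objective
    opt.zero_grad()
    loss.backward()
    opt.step()

with torch.no_grad():
    x_q = torch.randn(100000, 1) + 1.0
    x_p = torch.randn(100000, 1)
    est = renyi_objective(g, x_q, x_p, alpha).item()
# For equal-variance Gaussians, R_alpha = alpha * (mean gap)^2 / (2 sigma^2),
# i.e. 1.0 for this pair at alpha = 2.
print(f"estimated R_alpha(Q||P) ~ {est:.3f} (closed form: {alpha / 2:.3f})")

Even in this toy setting, the run-to-run spread of the estimate grows quickly with α, which illustrates why a representation free of such exponential terms is attractive.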
