Rates of convergence for density estimation with generative adversarial networks

30 January 2021
Nikita Puchkin, S. Samsonov, Denis Belomestny, Eric Moulines, A. Naumov
arXiv:2102.00199
Abstract

In this work we undertake a thorough study of the non-asymptotic properties of vanilla generative adversarial networks (GANs). We prove an oracle inequality for the Jensen-Shannon (JS) divergence between the underlying density $\mathsf{p}^*$ and the GAN estimate, with a significantly better statistical error term compared to previously known results. The advantage of our bound becomes clear in application to nonparametric density estimation. We show that the JS divergence between the GAN estimate and $\mathsf{p}^*$ decays as fast as $(\log n / n)^{2\beta/(2\beta + d)}$, where $n$ is the sample size and $\beta$ determines the smoothness of $\mathsf{p}^*$. This rate of convergence coincides (up to logarithmic factors) with the minimax optimal rate for the considered class of densities.
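To make the stated rate concrete, the minimal sketch below evaluates $(\log n / n)^{2\beta/(2\beta + d)}$ for a few sample sizes. The function name `js_rate` and the chosen values of $n$, $\beta$, and $d$ are illustrative assumptions, not part of the paper.

```python
# Illustrative only: evaluates the convergence rate stated in the abstract
# for the JS divergence between the GAN estimate and p*.
import math

def js_rate(n: int, beta: float, d: int) -> float:
    """Rate (log n / n)^(2*beta / (2*beta + d)) from the abstract."""
    return (math.log(n) / n) ** (2 * beta / (2 * beta + d))

# Smoother densities (larger beta) and lower dimension d give faster decay.
for n in (10**3, 10**4, 10**5):
    print(f"n={n:>6}  beta=2, d=5: {js_rate(n, 2.0, 5):.3e}   "
          f"beta=5, d=5: {js_rate(n, 5.0, 5):.3e}")
```

As the printout shows, increasing the smoothness $\beta$ relative to the dimension $d$ pushes the exponent $2\beta/(2\beta + d)$ toward 1, approaching the parametric rate $\log n / n$.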
