
Rates of convergence for density estimation with generative adversarial networks

Journal of Machine Learning Research (JMLR), 2021
Abstract

In this work, we undertake a thorough study of the non-asymptotic properties of vanilla generative adversarial networks (GANs). We prove an oracle inequality for the Jensen-Shannon (JS) divergence between the underlying density $\mathsf{p}^*$ and the GAN estimate, with a significantly better statistical error term than in previously known results. The advantage of our bound becomes clear in the application to nonparametric density estimation. We show that the JS divergence between the GAN estimate and $\mathsf{p}^*$ decays as fast as $(\log n / n)^{2\beta/(2\beta + d)}$, where $n$ is the sample size and $\beta$ determines the smoothness of $\mathsf{p}^*$. This rate of convergence coincides (up to logarithmic factors) with the minimax optimal rate for the considered class of densities.
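For reference, the JS divergence above is the standard symmetrized Kullback-Leibler divergence between a pair of densities. The display below is a sketch restating the claimed rate in that notation; the symbol $\widehat{\mathsf{p}}_n$ for the GAN estimate is an assumption for illustration, not notation quoted from the paper:

% Standard definition of the JS divergence between densities p and q,
% followed by the convergence rate stated in the abstract.
\[
\mathrm{JS}(\mathsf{p}, \mathsf{q})
  = \frac{1}{2}\,\mathrm{KL}\Bigl(\mathsf{p} \,\Big\|\, \frac{\mathsf{p}+\mathsf{q}}{2}\Bigr)
  + \frac{1}{2}\,\mathrm{KL}\Bigl(\mathsf{q} \,\Big\|\, \frac{\mathsf{p}+\mathsf{q}}{2}\Bigr),
\qquad
\mathrm{JS}\bigl(\widehat{\mathsf{p}}_n, \mathsf{p}^*\bigr)
  \lesssim \Bigl(\frac{\log n}{n}\Bigr)^{2\beta/(2\beta+d)}.
\]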
