In this work we undertake a thorough study of the non-asymptotic properties of the vanilla generative adversarial networks (GANs). We prove an oracle inequality for the Jensen-Shannon (JS) divergence between the underlying density $p^*$ and the GAN estimate with a significantly better statistical error term compared to the previously known results. The advantage of our bound becomes clear in the application to nonparametric density estimation. We show that the JS-divergence between the GAN estimate and $p^*$ decays as fast as $(\log^2 n/n)^{2\beta/(2\beta+d)}$, where $n$ is the sample size and $\beta$ determines the smoothness of $p^*$. This rate of convergence coincides (up to logarithmic factors) with the minimax optimal rate for the considered class of densities.
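For readability, a schematic display of the stated guarantee follows. The notation is a reconstruction rather than the paper's exact statement: $\widehat p_n$ stands for the GAN estimate, $d$ for the dimension of the observations, and the display suppresses the logarithmic factors mentioned above.
$$
\mathrm{JS}\bigl(p^{*}, \widehat p_{n}\bigr) \;\lesssim\; n^{-\frac{2\beta}{2\beta+d}}
\qquad \text{(up to logarithmic factors),}
$$
which is the classical minimax rate for estimating a $\beta$-smooth density in dimension $d$.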