Rates of convergence for density estimation with GANs
Journal of machine learning research (JMLR), 2021
Abstract
We undertake a precise study of the non-asymptotic properties of vanilla generative adversarial networks (GANs) and derive theoretical guarantees for the problem of estimating an unknown $d$-dimensional density $p^*$ under a proper choice of the class of generators and discriminators. We prove that the resulting density estimate converges to $p^*$ in terms of Jensen-Shannon (JS) divergence at the rate $(\log n / n)^{2\beta/(2\beta + d)}$, where $n$ is the sample size and $\beta$ determines the smoothness of $p^*$. This is the first result in the literature on density estimation using vanilla GANs with JS rates faster than $n^{-1/2}$ in the regime $\beta > d/2$.
