Rates of convergence for density estimation with GANs

Journal of Machine Learning Research (JMLR), 2021
Abstract

We undertake a precise study of the non-asymptotic properties of vanilla generative adversarial networks (GANs) and derive theoretical guarantees for the problem of estimating an unknown $d$-dimensional density $p^*$ under a proper choice of the class of generators and discriminators. We prove that the resulting density estimate converges to $p^*$ in Jensen-Shannon (JS) divergence at the rate $(\log n/n)^{2\beta/(2\beta+d)}$, where $n$ is the sample size and $\beta$ determines the smoothness of $p^*$. This is the first result in the literature on density estimation using vanilla GANs with JS rates faster than $n^{-1/2}$ in the regime $\beta > d/2$.
