Wasserstein GANs are Minimax Optimal Distribution Estimators

We provide non-asymptotic rates of convergence for the Wasserstein Generative Adversarial Network (WGAN) estimator. We build neural network classes representing the generators and discriminators which yield a GAN that achieves the minimax optimal rate for estimating a certain probability measure $\mu$ with support in $\mathbb{R}^p$. The probability $\mu$ is considered to be the push-forward of the Lebesgue measure on the $d$-dimensional torus by a map of smoothness $\beta+1$. Measuring the error with the $\gamma$-Hölder Integral Probability Metric (IPM), we obtain, up to logarithmic factors, the minimax optimal rate $n^{-\frac{\beta+\gamma}{2\beta+d}} \vee n^{-\frac{1}{2}}$, where $n$ is the sample size, $\beta$ determines the smoothness of the target measure $\mu$, $\gamma$ is the smoothness of the IPM ($\gamma=1$ is the Wasserstein case), and $d$ is the intrinsic dimension of $\mu$. In the process, we derive a sharp interpolation inequality between Hölder IPMs. This novel result in the theory of function spaces generalizes classical interpolation inequalities to the case where the measures involved have densities on different manifolds.
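The announced rate can be written out in display form; the sketch below is a reconstruction from the quantities defined in the abstract (sample size $n$, smoothness parameter $\beta$, IPM smoothness $\gamma$, intrinsic dimension $d$), where $\hat\mu_n$ is our notation for the WGAN estimator and $d_\gamma$ for the $\gamma$-Hölder IPM, not notation taken from the paper:

```latex
% Minimax rate in gamma-Holder IPM, up to logarithmic factors
% (hat{mu}_n and d_gamma are assumed notation for the estimator and the IPM).
\[
  d_\gamma\bigl(\hat\mu_n, \mu\bigr)
  \;\lesssim\;
  n^{-\frac{\beta+\gamma}{2\beta+d}} \vee n^{-\frac{1}{2}}.
\]
% Specializing to the Wasserstein case (gamma = 1):
\[
  W_1\bigl(\hat\mu_n, \mu\bigr)
  \;\lesssim\;
  n^{-\frac{\beta+1}{2\beta+d}} \vee n^{-\frac{1}{2}}.
\]
```

The $\vee\, n^{-1/2}$ term reflects the parametric floor: when the nonparametric exponent exceeds $1/2$, the estimation error is dominated by the usual $n^{-1/2}$ sampling rate.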