A theoretical basis for model collapse in recursive training
Abstract
It is known that recursively training generative models on their own outputs can lead to the so-called `collapse' of the simulated probability distribution. This note shows that one in fact obtains two different asymptotic behaviours, depending on whether an external source, however minor, also contributes samples.
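The dichotomy described in the abstract can be illustrated with a toy simulation (not taken from the paper; the distribution, sample sizes, and mixing fraction below are illustrative assumptions). A Gaussian is repeatedly fitted to data and then resampled from the fit: with no external source the fitted spread drifts toward zero, while even a small fraction of fresh samples from the true distribution keeps it bounded away from zero.

```python
import numpy as np

rng = np.random.default_rng(0)

def recursive_fit(n_rounds=1000, n_samples=100, fresh_frac=0.0):
    """Repeatedly fit a Gaussian to data, then resample from the fit.
    With fresh_frac > 0, each round mixes in samples drawn from the
    true distribution N(0, 1) (the 'external source')."""
    mu, sigma = 0.0, 1.0  # start at the true parameters
    for _ in range(n_rounds):
        n_fresh = int(fresh_frac * n_samples)
        synthetic = rng.normal(mu, sigma, n_samples - n_fresh)
        fresh = rng.normal(0.0, 1.0, n_fresh)  # true-distribution samples
        data = np.concatenate([synthetic, fresh])
        mu, sigma = data.mean(), data.std()  # refit on the mixed data
    return sigma

# Fully recursive: the fitted standard deviation collapses toward zero.
print("no external source :", recursive_fit(fresh_frac=0.0))
# A minor external source (10% fresh data) stabilises the spread.
print("10% external source:", recursive_fit(fresh_frac=0.1))
```

The collapse arises because the finite-sample variance estimate is both noisy and slightly biased low, so the purely recursive loop is a multiplicative random walk that drifts toward zero spread; the fresh samples act as a restoring term.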
