
A theoretical basis for model collapse in recursive training

Main: 4 pages
Bibliography: 1 page
Abstract

It is known that recursive training from generative models can lead to the so-called `collapse' of the simulated probability distribution. This note shows that one in fact gets two different asymptotic behaviours depending on whether an external source, however minor, is also contributing samples.
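The dichotomy described in the abstract can be illustrated with a toy simulation (an illustrative sketch, not the paper's actual model or analysis): repeatedly fitting a Gaussian by maximum likelihood to samples drawn from the previous generation's fit tends to drive the variance toward zero, whereas mixing in even a small fraction of samples from the original distribution keeps it bounded away from zero. All function names and parameter choices below are hypothetical.

```python
import random
import statistics

def fit_gaussian(samples):
    # Maximum-likelihood Gaussian fit: sample mean and population std.
    return statistics.fmean(samples), statistics.pstdev(samples)

def recursive_training(n=50, generations=2000, external_frac=0.0, seed=0):
    """Repeatedly fit a Gaussian to samples from the previous fit.

    external_frac is the fraction of each generation's training set
    drawn from the fixed 'true' distribution N(0, 1) instead of the model.
    Returns the final fitted standard deviation.
    """
    rng = random.Random(seed)
    mu, sigma = 0.0, 1.0  # start at the true distribution N(0, 1)
    n_ext = int(external_frac * n)
    for _ in range(generations):
        # Samples from the current model, plus optional external samples.
        samples = [rng.gauss(mu, sigma) for _ in range(n - n_ext)]
        samples += [rng.gauss(0.0, 1.0) for _ in range(n_ext)]
        mu, sigma = fit_gaussian(samples)
    return sigma

if __name__ == "__main__":
    # Pure recursion: finite-sample shrinkage compounds across generations,
    # so the fitted variance drifts toward zero (collapse).
    print("no external data:", recursive_training(external_frac=0.0))
    # A small external contribution anchors the variance near its true value.
    print("10% external data:", recursive_training(external_frac=0.1))
```

The contrast reflects the two asymptotic regimes the note distinguishes: without external samples the fitted scale performs a multiplicative random walk with negative drift, while any persistent external source acts as a restoring term.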
