On Forgetting and Stability of Score-Based Generative Models

Stanislas Strasman
Gabriel Cardoso
Sylvain Le Corff
Vincent Lemaire
Antonio Ocello
Main: 8 pages · 6 figures · 1 table · Bibliography: 3 pages · Appendix: 35 pages
Abstract

Understanding the stability and long-time behavior of generative models is a fundamental problem in modern machine learning. This paper provides quantitative bounds on the sampling error of score-based generative models by leveraging stability and forgetting properties of the Markov chain associated with the reverse-time dynamics. Under weak assumptions, we identify two structural properties that control the propagation of initialization and discretization errors along the backward process: a Lyapunov drift condition and a Doeblin-type minorization condition. A practical consequence is quantitative stability of the sampling procedure, as the reverse diffusion dynamics induces a contraction mechanism along the sampling trajectory. Our results clarify the role of stochastic dynamics in score-based models and provide a principled framework for analyzing the propagation of errors in such approaches.
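To make the setting concrete, the sketch below simulates the backward process the abstract refers to in a toy one-dimensional case where the score is known in closed form. The data distribution (a Gaussian N(2, 0.5²)), the Ornstein–Uhlenbeck forward noising SDE, and all numerical parameters are illustrative assumptions, not taken from the paper; the point is only to show where the two error sources enter: the sampler is initialized from N(0, 1) rather than the true terminal law p_T (initialization error), and the reverse-time SDE is discretized by an Euler–Maruyama scheme (discretization error).

```python
import numpy as np

# Illustrative assumptions (not from the paper): data distribution is
# N(2, 0.5^2); the forward noising process is the Ornstein-Uhlenbeck SDE
# dX_t = -X_t dt + sqrt(2) dB_t, whose Gaussian marginals give the score
# in closed form.
M0, S0 = 2.0, 0.5

def marginal(t):
    """Mean and variance of X_t when X_0 ~ N(M0, S0^2)."""
    e = np.exp(-t)
    return M0 * e, S0**2 * e**2 + 1.0 - e**2

def score(x, t):
    """Exact score grad_x log p_t(x) of the Gaussian marginal."""
    m, v = marginal(t)
    return -(x - m) / v

def sample(n=20_000, T=3.0, n_steps=300, seed=0):
    """Euler-Maruyama discretization of the reverse-time SDE,
    initialized from N(0, 1) as a proxy for p_T."""
    rng = np.random.default_rng(seed)
    ds = T / n_steps
    y = rng.standard_normal(n)           # initialization error: N(0,1) != p_T
    for k in range(n_steps):
        t = T - k * ds                   # current forward time
        drift = y + 2.0 * score(y, t)    # reverse drift: -f + g^2 * score
        y = y + drift * ds + np.sqrt(2.0 * ds) * rng.standard_normal(n)
    return y

samples = sample()
print(samples.mean(), samples.std())     # close to the data moments (2.0, 0.5)
```

Despite the mismatched start, the empirical moments end up close to those of the data distribution, illustrating the contraction mechanism the paper quantifies: the backward dynamics progressively forgets its initialization.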
