
Multi-Step Consistency Models: Fast Generation with Theoretical Guarantees

Abstract

Consistency models have recently emerged as a compelling alternative to traditional SDE-based diffusion models, offering significant acceleration by producing high-quality samples in very few steps. Despite their empirical success, a proper theoretical justification for their speed-up is still lacking. In this work, we provide an analysis that bridges this gap: given a consistency model that can map the input at a given time to arbitrary timestamps along the reverse trajectory, one can achieve a KL divergence of order $O(\varepsilon^2)$ using only $O\left(\log\left(\frac{d}{\varepsilon}\right)\right)$ iterations with constant step size, where $d$ is the data dimension. Additionally, under minimal assumptions on the data distribution (an increasingly common setting in recent diffusion model analyses), we show that a similar KL convergence guarantee can be obtained, with the number of steps scaling as $O\left(d \log\left(\frac{d}{\varepsilon}\right)\right)$. Going further, we also provide a theoretical analysis of the estimation of such consistency models, concluding that accurate learning is feasible using small discretization steps, in both smooth and non-smooth settings. Notably, our results for the non-smooth case yield best-in-class convergence rates compared to existing SDE- or ODE-based analyses under minimal assumptions.
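The sampler the abstract describes can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the learned consistency map `f`, the noise scale `T`, and the step count are hypothetical placeholders. The point it shows is that a map from a time $t$ to an arbitrary earlier time $s$ allows a constant step size in log-time, so the reverse trajectory from $T$ down to $\varepsilon$ is covered in $O(\log(T/\varepsilon))$ jumps.

```python
import numpy as np

def multistep_consistency_sample(f, d, T=80.0, eps=1e-2, num_steps=20, rng=None):
    """Sketch of multi-step consistency sampling (hypothetical interface).

    f(x, t, s): assumed learned consistency map sending a noisy input x
        at time t to the reverse-trajectory point at an earlier time s <= t.
    Timestamps decrease geometrically from T to eps, i.e. a constant step
    size in log-time, so num_steps scales like O(log(T / eps)).
    """
    rng = np.random.default_rng() if rng is None else rng
    ts = np.geomspace(T, eps, num_steps + 1)  # T = t_0 > t_1 > ... > t_K = eps
    x = T * rng.standard_normal(d)            # start from pure noise at time T
    for t, s in zip(ts[:-1], ts[1:]):
        x = f(x, t, s)                        # jump directly from time t to time s
    return x
```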

@article{jain2025_2505.01049,
  title={Multi-Step Consistency Models: Fast Generation with Theoretical Guarantees},
  author={Nishant Jain and Xunpeng Huang and Yian Ma and Tong Zhang},
  journal={arXiv preprint arXiv:2505.01049},
  year={2025}
}