
Improving Consistency Models with Generator-Augmented Flows

Thibaut Issenhuth
Sangchul Lee
Ludovic Dos Santos
Jean-Yves Franceschi
Chansoo Kim
Alain Rakotomamonjy
Abstract

Consistency models imitate the multi-step sampling of score-based diffusion in a single forward pass of a neural network. They can be learned in two ways: consistency distillation and consistency training. The former relies on the true velocity field of the corresponding differential equation, approximated by a pre-trained neural network. In contrast, the latter uses a single-sample Monte Carlo estimate of this velocity field. The related estimation error induces a discrepancy between consistency distillation and training that, we show, still holds in the continuous-time limit. To alleviate this issue, we propose a novel flow that transports noisy data towards their corresponding outputs derived from a consistency model. We prove that this flow reduces the previously identified discrepancy and the noise-data transport cost. Consequently, our method not only accelerates consistency training convergence but also enhances its overall performance. The code is available at: this https URL.
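To make the contrast concrete, the following is a minimal, hypothetical PyTorch sketch of a discretized consistency-training step, comparing the independent (data, noise) coupling of standard consistency training with a generator-augmented coupling in which each noise sample is paired with the output the current consistency model assigns to it. The linear interpolation x_t = (1 - t) x_0 + t z, the toy MLP, the EMA teacher, and all names (ConsistencyMLP, standard_pair, generator_augmented_pair, consistency_training_step) are illustrative assumptions, not the paper's implementation.

# Hypothetical sketch: consistency training with an independent vs. a
# generator-augmented (data, noise) coupling. Not the authors' code.
import torch
import torch.nn as nn

class ConsistencyMLP(nn.Module):
    """Toy consistency model f(x, t) mapping a noisy point back towards data."""
    def __init__(self, dim: int = 2, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([x, t[:, None]], dim=-1))

def standard_pair(x0: torch.Tensor):
    """Independent coupling used by vanilla consistency training:
    a data sample paired with an unrelated Gaussian noise sample."""
    return x0, torch.randn_like(x0)

@torch.no_grad()
def generator_augmented_pair(model: ConsistencyMLP, x0: torch.Tensor):
    """Generator-augmented coupling (sketch): pair each noise sample with the
    consistency model's own output for that noise, so the flow transports
    noise towards generator outputs instead of independent data samples."""
    z = torch.randn_like(x0)
    t_max = torch.ones(z.shape[0])
    x_gen = model(z, t_max)          # one-step generator output for this noise
    return x_gen, z

def consistency_training_step(model, ema_model, x_target, z, opt, dt: float = 0.01):
    """One discretized consistency-training step on the linear flow
    x_t = (1 - t) * x_target + t * z, matching predictions at adjacent times."""
    b = x_target.shape[0]
    t = torch.rand(b) * 0.9 + 0.05
    x_t  = (1 - t)[:, None] * x_target + t[:, None] * z
    x_tp = (1 - (t + dt))[:, None] * x_target + (t + dt)[:, None] * z
    with torch.no_grad():
        target = ema_model(x_t, t)   # stop-gradient EMA teacher at the earlier time
    loss = ((model(x_tp, t + dt) - target) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

In this sketch, swapping standard_pair for generator_augmented_pair is the only change between the two regimes; the training step itself is identical.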

@article{issenhuth2025_2406.09570,
  title={Improving Consistency Models with Generator-Augmented Flows},
  author={Thibaut Issenhuth and Sangchul Lee and Ludovic Dos Santos and Jean-Yves Franceschi and Chansoo Kim and Alain Rakotomamonjy},
  journal={arXiv preprint arXiv:2406.09570},
  year={2025}
}