
ODE_t(ODE_l): Shortcutting the Time and the Length in Diffusion and Flow Models for Faster Sampling

Main: 8 pages · 7 figures · 8 tables · Bibliography: 2 pages · Appendix: 3 pages
Abstract

Continuous normalizing flows (CNFs) and diffusion models (DMs) generate high-quality data from a noise distribution. However, their sampling process demands multiple iterations to solve an ordinary differential equation (ODE), which is computationally expensive. State-of-the-art methods focus on reducing the number of discrete time steps during sampling to improve efficiency. In this work, we explore a complementary direction in which the quality-complexity tradeoff can also be controlled in terms of the neural network length. We achieve this by rewiring the blocks in the transformer-based architecture to solve an inner discretized ODE w.r.t. its depth. Then, we apply a length consistency term during flow matching training, and as a result, the sampling can be performed with an arbitrary number of time steps and transformer blocks. Unlike other approaches, our ODE_t(ODE_l) is solver-agnostic in the time dimension and reduces both latency and, importantly, memory usage. CelebA-HQ and ImageNet generation experiments show a latency reduction of up to 2× in the most efficient sampling mode, and an FID improvement of up to 2.8 points for high-quality sampling when applied to prior methods. We open-source our code and checkpoints at this http URL.
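As a rough sketch of the core idea (not the authors' implementation): the inner depth ODE can be read as treating a weight-shared transformer block as the vector field of an ODE over depth, discretized with Euler steps, so the number of blocks becomes a sampling-time knob alongside the number of time steps. All names, the weight sharing, and the Euler discretizations below are illustrative assumptions.

```python
# Minimal sketch: depth as an inner ODE, time as the usual outer ODE.
# DepthODETransformer, block, and sample are hypothetical names.
import torch
import torch.nn as nn

class DepthODETransformer(nn.Module):
    """Transformer blocks reinterpreted as Euler steps of an ODE over
    depth, so the number of blocks L can be chosen at sampling time."""
    def __init__(self, dim: int, n_heads: int = 8):
        super().__init__()
        # One shared block; depth becomes a discretization choice,
        # not a fixed stack of distinct layers (an assumption here).
        self.block = nn.TransformerEncoderLayer(dim, n_heads, batch_first=True)

    def forward(self, h: torch.Tensor, num_blocks: int) -> torch.Tensor:
        dl = 1.0 / num_blocks  # step size of the inner depth ODE
        for _ in range(num_blocks):
            # Euler step over depth: h <- h + dl * f(h), with f taken
            # to be the block's residual update.
            h = h + dl * (self.block(h) - h)
        return h

def sample(model: DepthODETransformer, x: torch.Tensor,
           num_steps: int, num_blocks: int) -> torch.Tensor:
    """Outer Euler loop in time t; each step evaluates the network with
    a chosen depth L, trading quality against latency and memory."""
    dt = 1.0 / num_steps
    for _ in range(num_steps):
        x = x + dt * model(x, num_blocks)  # network output as velocity field
    return x
```

Under this reading, the length consistency term mentioned in the abstract would train the network so that predictions remain stable across different choices of num_blocks, which is what makes depth a free parameter at sampling time.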
