
Faster Diffusion-based Sampling with Randomized Midpoints: Sequential and Parallel

Abstract

In recent years, there has been a surge of interest in proving discretization bounds for diffusion models. These works show that for essentially any data distribution, one can approximately sample in polynomial time given a sufficiently accurate estimate of its score functions at different noise levels. In this work, we propose a new discretization scheme for diffusion models inspired by the randomized midpoint method of Shen and Lee (2019) for log-concave sampling. We prove that this approach achieves the best known dimension dependence for sampling from arbitrary smooth distributions in total variation distance ($\widetilde{O}(d^{5/12})$ compared to $\widetilde{O}(\sqrt{d})$ from prior work). We also show that our algorithm can be parallelized to run in only $\widetilde{O}(\log^2 d)$ parallel rounds, constituting the first provable guarantees for parallel sampling with diffusion models. As a byproduct of our methods, for the well-studied problem of log-concave sampling in total variation distance, we give an algorithm and simple analysis achieving dimension dependence $\widetilde{O}(d^{5/12})$ compared to $\widetilde{O}(\sqrt{d})$ from prior work.
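To convey the core idea, here is a minimal sketch of the randomized midpoint discretization applied to a generic overdamped Langevin step, not the paper's actual diffusion sampler or its underdamped variant. The names (`grad_f`, `step_size`, `n_steps`) are illustrative assumptions: each step draws a uniform time fraction, forms a crude Euler estimate of the state at that random intermediate time, and uses the drift evaluated there for the full step, which is unbiased for the integrated drift in expectation over the random fraction.

```python
import numpy as np

def randomized_midpoint_langevin(grad_f, x0, step_size, n_steps, rng=None):
    """Sketch of a randomized-midpoint step for dx = -grad_f(x) dt + sqrt(2) dB_t.

    Illustrative only; the paper applies this idea to the reverse diffusion
    process with learned score functions rather than to plain Langevin dynamics.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    d = x.shape[0]
    h = step_size
    for _ in range(n_steps):
        alpha = rng.uniform()  # random fraction of the step
        # Independent Brownian increments over [0, alpha*h] and [alpha*h, h],
        # scaled so that w1 + w2 is the full increment sqrt(2)*(B_{t+h} - B_t).
        w1 = rng.normal(size=d) * np.sqrt(2 * alpha * h)
        w2 = rng.normal(size=d) * np.sqrt(2 * (1 - alpha) * h)
        # Euler-type estimate of the state at the random intermediate time alpha*h.
        x_mid = x - alpha * h * grad_f(x) + w1
        # Full step using the drift evaluated at the random midpoint.
        x = x - h * grad_f(x_mid) + (w1 + w2)
    return x
```

Using the drift at a uniformly random intermediate time, rather than at the start of the step, is what removes the leading-order discretization bias and underlies the improved dimension dependence.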
