Faster Diffusion-based Sampling with Randomized Midpoints: Sequential and Parallel

3 June 2024
Shivam Gupta
Linda Cai
Sitan Chen
arXiv:2406.00924
Abstract

In recent years, there has been a surge of interest in proving discretization bounds for diffusion models. These works show that for essentially any data distribution, one can approximately sample in polynomial time given a sufficiently accurate estimate of its score functions at different noise levels. In this work, we propose a new discretization scheme for diffusion models inspired by Shen and Lee's randomized midpoint method for log-concave sampling [ShenL19]. We prove that this approach achieves the best known dimension dependence for sampling from arbitrary smooth distributions in total variation distance ($\widetilde O(d^{5/12})$ compared to $\widetilde O(\sqrt{d})$ from prior work). We also show that our algorithm can be parallelized to run in only $\widetilde O(\log^2 d)$ parallel rounds, constituting the first provable guarantees for parallel sampling with diffusion models. As a byproduct of our methods, for the well-studied problem of log-concave sampling in total variation distance, we give an algorithm and simple analysis achieving dimension dependence $\widetilde O(d^{5/12})$ compared to $\widetilde O(\sqrt{d})$ from prior work.
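The randomized midpoint idea from [ShenL19] that the abstract builds on can be illustrated with a minimal sketch. The sketch below applies it to overdamped Langevin dynamics $dX_t = \nabla \log p(X_t)\,dt + \sqrt{2}\,dB_t$ rather than to the paper's diffusion-model discretization; the function names and step size are illustrative assumptions, not the authors' implementation. The key trick is to evaluate the drift at a uniformly random point inside each step, which cancels the leading-order discretization bias in expectation:

```python
import numpy as np

def randomized_midpoint_step(x, grad_log_p, h, rng):
    """One randomized-midpoint step of size h for the Langevin SDE
    dX_t = grad_log_p(X_t) dt + sqrt(2) dB_t (illustrative sketch)."""
    alpha = rng.uniform()  # random midpoint fraction in [0, 1]
    # Independent Brownian increments on [0, alpha*h] and [alpha*h, h];
    # their sum is the Brownian increment over the full step.
    w1 = rng.normal(scale=np.sqrt(2 * alpha * h), size=np.shape(x))
    w2 = rng.normal(scale=np.sqrt(2 * (1 - alpha) * h), size=np.shape(x))
    # Cheap Euler estimate of the state at the random midpoint
    x_mid = x + alpha * h * grad_log_p(x) + w1
    # Full step, with the drift evaluated at the midpoint estimate
    return x + h * grad_log_p(x_mid) + w1 + w2
```

For a standard Gaussian target, `grad_log_p` is simply `lambda y: -y`, and iterating the step drives an ensemble of particles toward unit variance; in the paper's setting the drift is replaced by the learned score at the appropriate noise level.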
