Fast Solvers for Discrete Diffusion Models: Theory and Applications of High-Order Algorithms

1 February 2025
Yinuo Ren
Haoxuan Chen
Yuchen Zhu
Wei Guo
Yongxin Chen
Grant M. Rotskoff
Molei Tao
Lexing Ying
arXiv: 2502.00234
Main: 10 pages; 8 figures; Bibliography: 14 pages; 7 tables; Appendix: 24 pages
Abstract

Discrete diffusion models have emerged as a powerful generative modeling framework for discrete data, with successful applications spanning from text generation to image synthesis. However, their deployment faces challenges due to the high dimensionality of the state space, necessitating the development of efficient inference algorithms. Current inference approaches mainly fall into two categories: exact simulation and approximate methods such as τ-leaping. While exact methods suffer from unpredictable inference time and redundant function evaluations, τ-leaping is limited by its first-order accuracy. In this work, we advance the latter category by tailoring the first extension of high-order numerical inference schemes to discrete diffusion models, enabling larger step sizes while reducing error. We rigorously analyze the proposed schemes and establish the second-order accuracy of the θ-trapezoidal method in KL divergence. Empirical evaluations on GPT-2-level text and ImageNet-level image generation tasks demonstrate that our method achieves superior sample quality compared to existing approaches under equivalent computational constraints.
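
To make the contrast between exact simulation and τ-leaping concrete, the sketch below shows a minimal first-order τ-leaping loop for a single continuous-time Markov chain with a known rate matrix. This is an illustrative toy under assumed names (`Q`, `tau_leaping`), not the paper's multi-dimensional scheme or its θ-trapezoidal variant.

```python
import numpy as np

def tau_leaping(Q, x0, T, tau, rng=None):
    """Toy first-order tau-leaping for a single CTMC with rate matrix Q.

    Q   : (S, S) array; Q[i, j] is the jump rate from state i to state j
          for i != j (an assumed generic parameterization, not the paper's).
    x0  : initial state index.
    T   : time horizon.
    tau : fixed leap (step) size; larger tau means fewer function
          evaluations but larger first-order discretization error.
    """
    rng = rng or np.random.default_rng()
    x, t = x0, 0.0
    while t < T:
        # Freeze the rates at the current state and draw, for each target
        # state, a Poisson number of firings over the leap interval.
        rates = np.asarray(Q[x], dtype=float).copy()
        rates[x] = 0.0
        firings = rng.poisson(rates * tau)
        fired = np.flatnonzero(firings)
        if fired.size > 0:
            # A single particle occupies one state, so if several channels
            # fire within one leap we pick one uniformly at random
            # (one common convention for this toy setting).
            x = int(rng.choice(fired))
        t += tau
    return x
```

Because the rates are frozen over each leap, this scheme is only first-order accurate in the step size; the high-order schemes analyzed in the paper (e.g. the θ-trapezoidal method) reduce this error, allowing larger leaps under the same computational budget.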
