
Almost Linear Convergence under Minimal Score Assumptions: Quantized Transition Diffusion

Main: 11 pages; Bibliography: 3 pages; Appendix: 23 pages; 4 figures, 3 tables
Abstract

Continuous diffusion models have demonstrated remarkable performance in data generation across various domains, yet their efficiency remains constrained by two critical limitations: (1) the local adjacency structure of the forward Markov process, which restricts long-range transitions in the data space, and (2) inherent biases introduced during the simulation of time-inhomogeneous reverse denoising processes. To address these challenges, we propose Quantized Transition Diffusion (QTD), a novel approach that integrates data quantization with discrete diffusion dynamics. Our method first transforms the continuous data distribution $p_*$ into a discrete one $q_*$ via histogram approximation and binary encoding, enabling efficient representation in a structured discrete latent space. We then design a continuous-time Markov chain (CTMC) with Hamming distance-based transitions as the forward process, which inherently supports long-range movements in the original data space. For reverse-time sampling, we introduce a truncated uniformization technique to simulate the reverse CTMC, which provably provides unbiased generation from $q_*$ under minimal score assumptions. Through a novel KL dynamic analysis of the reverse CTMC, we prove that QTD can generate samples with $O(d\ln^2(d/\epsilon))$ score evaluations in expectation to approximate the $d$-dimensional target distribution $p_*$ within an $\epsilon$ error tolerance. Our method not only establishes state-of-the-art inference efficiency but also advances the theoretical foundations of diffusion-based generative modeling by unifying discrete and continuous diffusion paradigms.
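The sketch below is a minimal, illustrative rendering (not the paper's implementation) of two ingredients the abstract names: (i) histogram quantization plus binary encoding of continuous data, and (ii) uniformization-based simulation of a forward CTMC whose jumps are single bit flips, i.e. Hamming distance-1 transitions. All function names, the bit width `n_bits`, and the flip rate `rate` are assumptions made for this example; the paper's reverse-time sampler additionally truncates the uniformization jump count, which is not shown here.

```python
import numpy as np

def quantize_binary(x, n_bits=8, lo=-1.0, hi=1.0):
    """Histogram-quantize each coordinate of x in [lo, hi] into 2**n_bits bins
    and return the bin indices as binary codes (MSB first)."""
    levels = 2 ** n_bits
    idx = np.clip(((x - lo) / (hi - lo) * levels).astype(int), 0, levels - 1)
    bits = ((idx[..., None] >> np.arange(n_bits - 1, -1, -1)) & 1).astype(np.uint8)
    return bits.reshape(*x.shape[:-1], -1)  # shape: (..., d * n_bits)

def simulate_forward_ctmc(bits, t, rate=1.0, rng=None):
    """Uniformization-style simulation of a bit-flip CTMC on {0,1}^D over [0, t].

    Each bit flips independently at `rate`, so the total exit rate from every
    state is rate * D. Uniformization with that bound draws the number of jumps
    from Poisson(rate * D * t) and applies one uniformly chosen bit flip
    (a Hamming-1 move) per jump.
    """
    rng = np.random.default_rng() if rng is None else rng
    state = bits.copy()
    D = state.size
    n_jumps = rng.poisson(rate * D * t)            # uniformized jump count on [0, t]
    flip_sites = rng.integers(0, D, size=n_jumps)  # one bit flipped per jump
    for j in flip_sites:
        state.flat[j] ^= 1
    return state

# Usage: encode a 2-dimensional sample and noise it up to time t = 0.5.
x0 = np.array([0.3, -0.7])
b0 = quantize_binary(x0, n_bits=4)
bt = simulate_forward_ctmc(b0, t=0.5, rate=0.25)
```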

@article{huang2025_2505.21892,
  title={Almost Linear Convergence under Minimal Score Assumptions: Quantized Transition Diffusion},
  author={Xunpeng Huang and Yingyu Lin and Nikki Lijing Kuang and Hanze Dong and Difan Zou and Yian Ma and Tong Zhang},
  journal={arXiv preprint arXiv:2505.21892},
  year={2025}
}