Almost Linear Convergence under Minimal Score Assumptions: Quantized Transition Diffusion

Continuous diffusion models have demonstrated remarkable performance in data generation across various domains, yet their efficiency remains constrained by two critical limitations: (1) the local adjacency structure of the forward Markov process, which restricts long-range transitions in the data space, and (2) inherent biases introduced during the simulation of time-inhomogeneous reverse denoising processes. To address these challenges, we propose Quantized Transition Diffusion (QTD), a novel approach that integrates data quantization with discrete diffusion dynamics. Our method first transforms the continuous data distribution into a discrete one via histogram approximation and binary encoding, enabling efficient representation in a structured discrete latent space. We then design a continuous-time Markov chain (CTMC) with Hamming distance-based transitions as the forward process, which inherently supports long-range movements in the original data space. For reverse-time sampling, we introduce a truncated uniformization technique to simulate the reverse CTMC, which provably yields unbiased generation from the target distribution under minimal score assumptions. Through a novel KL dynamic analysis of the reverse CTMC, we prove that QTD can approximate the d-dimensional target distribution within a prescribed error tolerance using an expected number of score evaluations that scales almost linearly in dimension. Our method not only establishes state-of-the-art inference efficiency but also advances the theoretical foundations of diffusion-based generative modeling by unifying discrete and continuous diffusion paradigms.
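To make the first step concrete, here is a minimal sketch of turning continuous data into a discrete binary representation via histogram approximation and bitwise encoding. The bin count, the uniform binning, and the function names are illustrative assumptions, not the paper's exact construction; under such an encoding, single-bit flips are exactly the unit Hamming-distance moves the forward CTMC would use, and a flip of a high-order bit corresponds to a long-range jump in the original data space.

```python
import numpy as np

def quantize_binary(x, lo, hi, n_bits=4):
    """Histogram approximation + binary encoding (illustrative sketch).

    Values in [lo, hi] are assigned to 2**n_bits uniform bins, and each
    bin index is written as an n_bits-long vector of 0/1 entries.
    """
    n_bins = 2 ** n_bits
    # Histogram approximation: clip into range, assign each value a bin index.
    idx = np.clip(((x - lo) / (hi - lo) * n_bins).astype(int), 0, n_bins - 1)
    # Binary encoding: expand each bin index into its bit vector (MSB first).
    return ((idx[..., None] >> np.arange(n_bits)[::-1]) & 1).astype(np.uint8)

def dequantize_binary(bits, lo, hi):
    """Decode bit vectors back to the centers of their histogram bins."""
    n_bits = bits.shape[-1]
    idx = (bits * (1 << np.arange(n_bits)[::-1])).sum(axis=-1)
    return lo + (idx + 0.5) * (hi - lo) / (2 ** n_bits)
```

Round-tripping a value through this encoding incurs at most half a bin width of error, which is the histogram-approximation error the discrete model must tolerate.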
@article{huang2025_2505.21892,
  title={Almost Linear Convergence under Minimal Score Assumptions: Quantized Transition Diffusion},
  author={Xunpeng Huang and Yingyu Lin and Nikki Lijing Kuang and Hanze Dong and Difan Zou and Yian Ma and Tong Zhang},
  journal={arXiv preprint arXiv:2505.21892},
  year={2025}
}