Informed Correctors for Discrete Diffusion Models

Discrete diffusion has emerged as a powerful framework for generative modeling in discrete domains, yet efficiently sampling from these models remains challenging. Existing sampling strategies often struggle to balance computation and sample quality when the number of sampling steps is reduced, even when the model has learned the data distribution well. To address these limitations, we propose a predictor-corrector sampling scheme where the corrector is informed by the diffusion model to more reliably counter the accumulating approximation errors. To further enhance the effectiveness of our informed corrector, we introduce complementary architectural modifications based on hollow transformers and a simple tailored training objective that leverages more training signal. We use a synthetic example to illustrate the failure modes of existing samplers and show how informed correctors alleviate these problems. On tokenized ImageNet 256×256, this approach consistently produces superior samples with fewer steps, achieving improved FID scores for discrete diffusion models. These results underscore the potential of informed correctors for fast and high-fidelity generation using discrete diffusion.
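To make the core idea concrete, below is a minimal sketch of a model-informed corrector step. It is not the paper's actual algorithm: it only illustrates the general principle of using the diffusion model's own predictions to decide which tokens to correct, rather than perturbing positions uniformly at random. All names (informed_corrector_step, logits_fn, k) are hypothetical, and the interface assumes a model that returns per-position categorical logits over the vocabulary.

    import numpy as np

    def informed_corrector_step(x, logits_fn, t, k, rng):
        # logits_fn(x, t) -> (seq_len, vocab_size) array of per-position
        # logits from the learned denoising model (assumed interface).
        logits = logits_fn(x, t)
        logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
        log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
        # Log-probability the model assigns to the token currently at each position.
        token_logp = np.take_along_axis(log_probs, x[:, None], axis=1)[:, 0]
        # "Informed" targeting: resample only the k positions the model
        # finds least plausible, drawing from its predicted marginals.
        for i in np.argsort(token_logp)[:k]:
            x[i] = rng.choice(log_probs.shape[1], p=np.exp(log_probs[i]))
        return x

    # Toy usage with a random stand-in for the learned model.
    rng = np.random.default_rng(0)
    seq_len, vocab_size = 16, 32
    x = rng.integers(vocab_size, size=seq_len)
    dummy_logits = lambda x, t: rng.normal(size=(seq_len, vocab_size))
    x = informed_corrector_step(x, dummy_logits, t=0.5, k=4, rng=rng)

In a full predictor-corrector sampler, a step like this would be interleaved with the usual predictor (reverse-diffusion) updates so that errors accumulated by the predictor are repaired where the model itself flags them.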
@article{zhao2025_2407.21243,
  title   = {Informed Correctors for Discrete Diffusion Models},
  author  = {Yixiu Zhao and Jiaxin Shi and Feng Chen and Shaul Druckmann and Lester Mackey and Scott Linderman},
  journal = {arXiv preprint arXiv:2407.21243},
  year    = {2025}
}