ResearchTrend.AI


Beyond Autoregression: Discrete Diffusion for Complex Reasoning and Planning

18 October 2024
Jiacheng Ye
Jiahui Gao
Shansan Gong
Lin Zheng
Xin Jiang
Zhenguo Li
Lingpeng Kong
Abstract

Autoregressive language models, despite their impressive capabilities, struggle with complex reasoning and long-term planning tasks. We introduce discrete diffusion models as a novel solution to these challenges. Through the lens of subgoal imbalance, we demonstrate how diffusion models effectively learn difficult subgoals that elude autoregressive approaches. We propose Multi-Granularity Diffusion Modeling (MGDM), which prioritizes subgoals based on difficulty during learning. On complex tasks like Countdown, Sudoku, and Boolean Satisfiability Problems, MGDM significantly outperforms autoregressive models without using search techniques. For instance, MGDM achieves 91.5% and 100% accuracy on Countdown and Sudoku, respectively, compared to 45.8% and 20.7% for autoregressive models. Our work highlights the potential of diffusion-based approaches in advancing AI capabilities for sophisticated language understanding and problem-solving tasks. All associated code is available at this https URL.
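The core idea the abstract describes — prioritizing subgoals by difficulty during training — can be illustrated with a small sketch. The snippet below reweights per-token losses so that harder tokens (those with higher current loss) contribute more to the training objective. This is an illustrative approximation of the concept, not the paper's exact formulation; the function name and the `alpha` temperature parameter are assumptions for the example.

```python
import math
import numpy as np


def difficulty_weighted_loss(token_losses, alpha=1.0):
    """Aggregate per-token losses with difficulty-based weights.

    Illustrative sketch of difficulty-prioritized subgoal learning:
    tokens (subgoals) with larger loss receive larger weight via a
    softmax over the losses. `alpha` (hypothetical) controls how
    sharply hard subgoals are emphasized; alpha=0 recovers the plain
    mean loss.
    """
    losses = np.asarray(token_losses, dtype=float)
    # Softmax over losses: harder tokens get proportionally more weight.
    w = np.exp(alpha * (losses - losses.max()))  # shift for stability
    w = w / w.sum()
    return float((w * losses).sum())


# With uniform difficulty the weighted loss equals the mean;
# with mixed difficulty it is pulled toward the hard subgoal.
uniform = difficulty_weighted_loss([1.0, 1.0, 1.0])
mixed = difficulty_weighted_loss([0.1, 2.0])
```

Under this weighting, easy subgoals that the model has already mastered stop dominating the gradient, which is one plausible reading of how MGDM counteracts the subgoal imbalance the authors identify.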

@article{ye2025_2410.14157,
  title={Beyond Autoregression: Discrete Diffusion for Complex Reasoning and Planning},
  author={Jiacheng Ye and Jiahui Gao and Shansan Gong and Lin Zheng and Xin Jiang and Zhenguo Li and Lingpeng Kong},
  journal={arXiv preprint arXiv:2410.14157},
  year={2025}
}