
SRA-MCTS: Self-driven Reasoning Augmentation with Monte Carlo Tree Search for Enhanced Code Generation

17 November 2024
Bin Xu
Yiguan Lin
Yinghao Li
Yang Gao
ArXiv (abs) · PDF · HTML
Abstract

Large language models demonstrate exceptional performance in simple code generation tasks but still face challenges in tackling complex problems. These challenges may stem from insufficient reasoning and problem decomposition capabilities. To address this issue, we propose a reasoning-augmented data generation process, SRA-MCTS, which guides the model to autonomously generate high-quality intermediate reasoning paths. This creates a positive feedback loop, enabling continuous improvement. Our method operates entirely through the model itself without requiring additional supervision. By synthesizing natural language reasoning paths and translating them into executable code, the approach ensures analytical accuracy and enhances the success rate in solving complex tasks. Experimental results show that, even without additional supervisory signals, our method achieves performance improvements across different model scales, demonstrating the significant potential of self-improvement in small models. Furthermore, the method remains robust when traditional Chain-of-Thought (CoT) approaches exhibit performance degradation, with notable improvements observed in diversity metrics such as pass@10. We encourage further exploration of reasoning processes within training data to enhance the ability of language models to address complex problems.
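
As a rough illustration of the loop the abstract describes, the sketch below runs a toy Monte Carlo Tree Search over natural-language reasoning steps and rewards a path when the code derived from it passes tests. This is a minimal sketch of the idea, not the authors' implementation: propose_steps and path_passes_tests are hypothetical stand-ins for the model's step generator and a code-execution checker.

# Toy MCTS over natural-language reasoning paths, in the spirit of the
# abstract: grow candidate step sequences, score each path by whether the
# code it leads to passes tests, and keep statistics on the best paths.
# propose_steps and path_passes_tests are hypothetical placeholders,
# NOT the SRA-MCTS implementation.
import math
import random

class Node:
    def __init__(self, step, parent=None):
        self.step = step          # one natural-language reasoning step
        self.parent = parent
        self.children = []
        self.visits = 0
        self.value = 0.0          # accumulated reward

    def uct(self, c=1.4):
        # Upper Confidence bound for Trees; unvisited nodes explore first.
        if self.visits == 0:
            return float("inf")
        return self.value / self.visits + c * math.sqrt(
            math.log(self.parent.visits) / self.visits)

def propose_steps(path):
    # Hypothetical: an LLM would propose candidate next reasoning steps here.
    return [f"step-{len(path)}-{i}" for i in range(3)]

def path_passes_tests(path):
    # Hypothetical: translate the reasoning path into code, run unit tests,
    # and return 1.0 on success, 0.0 on failure.
    return float(random.random() < 0.3)

def search(root, iterations=200, max_depth=5):
    for _ in range(iterations):
        node, path = root, []
        # Selection: descend by UCT while children exist.
        while node.children:
            node = max(node.children, key=Node.uct)
            path.append(node.step)
        # Expansion: add candidate next steps unless at the depth limit.
        if len(path) < max_depth:
            node.children = [Node(s, node) for s in propose_steps(path)]
            node = random.choice(node.children)
            path.append(node.step)
        # Simulation: does the code derived from this path pass the tests?
        reward = path_passes_tests(path)
        # Backpropagation: update statistics up to the root.
        while node is not None:
            node.visits += 1
            node.value += reward
            node = node.parent

root = Node(step=None)
search(root)
best = max(root.children, key=lambda n: n.visits)
print("most-visited first step:", best.step, "visits:", best.visits)

In the actual method, the high-reward reasoning paths and their code translations would then be fed back as training data, which is the positive feedback loop the abstract refers to.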

View on arXiv
@article{xu2025_2411.11053,
  title={SRA-MCTS: Self-driven Reasoning Augmentation with Monte Carlo Tree Search for Code Generation},
  author={Bin Xu and Yiguan Lin and Yinghao Li and Yang Gao},
  journal={arXiv preprint arXiv:2411.11053},
  year={2025}
}
Main: 7 pages · 6 figures · Bibliography: 2 pages · 3 tables