SIKeD: Self-guided Iterative Knowledge Distillation for mathematical reasoning

Annual Meeting of the Association for Computational Linguistics (ACL), 2024
24 October 2024
Shivam Adarsh
Kumar Shridhar
Caglar Gulcehre
Nicholas Monath
Mrinmaya Sachan
LRM
ArXiv (abs) · PDF · HTML · GitHub (4★)

Papers citing "SIKeD: Self-guided Iterative Knowledge Distillation for mathematical reasoning"

3 papers
CAC-CoT: Connector-Aware Compact Chain-of-Thought for Efficient Reasoning Data Synthesis Across Dual-System Cognitive Tasks
Sunguk Choi
Yonghoon Kwon
Heondeuk Lee
ReLM, LRM
26 Aug 2025
A Survey on Large Language Models for Mathematical Reasoning
Peng-Yuan Wang
Tian-Shuo Liu
Chenyang Wang
Yi-Di Wang
Shu Yan
...
Xu-Hui Liu
Xin-Wei Chen
Jia-Cheng Xu
Ziniu Li
Yang Yu
LRM
10 Jun 2025
UNDO: Understanding Distillation as Optimization
Kushal Kumar Jain
Piyushi Goyal
Kumar Shridhar
03 Apr 2025