
Beyond Scaling Law: A Data-Efficient Distillation Framework for Reasoning
arXiv:2508.09883

13 August 2025
Xiaojun Wu, Xiaoguang Jiang, Xue Yang, Jucai Zhai, Dengfeng Liu, Q. Hao, Huang Liu, Zhiguo Yang, Ji Xie, Ninglun Gu, Jin Yang, Kailai Zhang, Yelun Bao, Jun Wang
LRM
ArXiv (abs) · PDF · HTML

Papers citing "Beyond Scaling Law: A Data-Efficient Distillation Framework for Reasoning"

3 / 3 papers shown

Revisiting Knowledge Distillation: The Hidden Role of Dataset Size (17 Oct 2025)
Giulia Lanzillotta, Felix Sarnthein, Gil Kur, Thomas Hofmann, Bobby He

Detecting Distillation Data from Reasoning Models (06 Oct 2025)
H. Zhang, Hyeong Kyu Choi, Yixuan Li, Hongxin Wei

Merge-of-Thought Distillation (10 Sep 2025)
Zhanming Shen, Zeyu Qin, Zenan Huang, Hao Chen, J. Hu, Yihong Zhuang, Guoshan Lu, Gang Chen, Junbo Zhao
MoMe, LRM