arXiv:2508.09883
Beyond Scaling Law: A Data-Efficient Distillation Framework for Reasoning
13 August 2025
Xiaojun Wu, Xiaoguang Jiang, Xue Yang, Jucai Zhai, Dengfeng Liu, Q. Hao, Huang Liu, Zhiguo Yang, Ji Xie, Ninglun Gu, Jin Yang, Kailai Zhang, Yelun Bao, Jun Wang
Papers citing "Beyond Scaling Law: A Data-Efficient Distillation Framework for Reasoning" (3 papers)
Revisiting Knowledge Distillation: The Hidden Role of Dataset Size
Giulia Lanzillotta, Felix Sarnthein, Gil Kur, Thomas Hofmann, Bobby He
17 Oct 2025

Detecting Distillation Data from Reasoning Models
H. Zhang, Hyeong Kyu Choi, Yixuan Li, Hongxin Wei
06 Oct 2025

Merge-of-Thought Distillation
Zhanming Shen, Zeyu Qin, Zenan Huang, Hao Chen, J. Hu, Yihong Zhuang, Guoshan Lu, Gang Chen, Junbo Zhao
10 Sep 2025