Distilled Pretraining: A modern lens of Data, In-Context Learning and Test-Time Scaling
arXiv:2509.01649 · 1 September 2025
Sachin Goyal, David Lopez-Paz, Kartik Ahuja
Papers citing "Distilled Pretraining: A modern lens of Data, In-Context Learning and Test-Time Scaling" (3 of 3 papers shown)
1. Mode-Conditioning Unlocks Superior Test-Time Scaling. Chen Henry Wu, Sachin Goyal, Aditi Raghunathan. 30 Nov 2025.
2. A Survey on Collaborating Small and Large Language Models for Performance, Cost-effectiveness, Cloud-edge Privacy, and Trustworthiness. Fali Wang, Jihai Chen, Shuhua Yang, Ali Al-Lawati, Linli Tang, Hui Liu, Suhang Wang. 14 Oct 2025.
3. Pre-training under infinite compute. Konwoo Kim, Suhas Kotha, Abigail Z. Jacobs, Tatsunori Hashimoto. 18 Sep 2025.