CMR Scaling Law: Predicting Critical Mixture Ratios for Continual Pre-training of Language Models
arXiv:2407.17467 · 24 July 2024
Jiawei Gu, Zacc Yang, Chuanghao Ding, Rui Zhao, Fei Tan
Papers citing "CMR Scaling Law: Predicting Critical Mixture Ratios for Continual Pre-training of Language Models" (3 papers)
Selecting Large Language Model to Fine-tune via Rectified Scaling Law
Haowei Lin, Baizhou Huang, Haotian Ye, Qinyu Chen, Zihao Wang, Sujian Li, Jianzhu Ma, Xiaojun Wan, James Y. Zou, Yitao Liang
04 Feb 2024
What Makes Pre-trained Language Models Better Zero-shot Learners?
Jinghui Lu, Dongsheng Zhu, Weidong Han, Rui Zhao, Brian Mac Namee, Fei Tan
30 Sep 2022
Scaling Laws for Neural Language Models
Jared Kaplan, Sam McCandlish, T. Henighan, Tom B. Brown, B. Chess, R. Child, Scott Gray, Alec Radford, Jeff Wu, Dario Amodei
23 Jan 2020