Findings of the Second BabyLM Challenge: Sample-Efficient Pretraining on Developmentally Plausible Corpora
6 December 2024
Michael Y. Hu, Aaron Mueller, Candace Ross, Adina Williams, Tal Linzen, Chengxu Zhuang, Ryan Cotterell, Leshem Choshen, Alex Warstadt, Ethan Gotlieb Wilcox
Papers citing "Findings of the Second BabyLM Challenge: Sample-Efficient Pretraining on Developmentally Plausible Corpora" (3 of 3 shown):

1. The potential -- and the pitfalls -- of using pre-trained language models as cognitive science theories
   Raj Sanjay Shah, Sashank Varma (LRM), 22 Jan 2025
2. AntLM: Bridging Causal and Masked Language Models
   Xinru Yu, Bin Guo, Shiwei Luo, J. Wang, Tao Ji, Yuanbin Wu (CLL), 04 Dec 2024
3. Natural Language Processing RELIES on Linguistics
   Juri Opitz, Shira Wein, Nathan Schneider (AI4CE), 09 May 2024