Cited By
[Call for Papers] The 2nd BabyLM Challenge: Sample-efficient pretraining on a developmentally plausible corpus
arXiv:2404.06214 · 9 April 2024
Leshem Choshen, Ryan Cotterell, Michael Y. Hu, Tal Linzen, Aaron Mueller, Candace Ross, Alex Warstadt, Ethan Gotlieb Wilcox, Adina Williams, Chengxu Zhuang
Papers citing "[Call for Papers] The 2nd BabyLM Challenge: Sample-efficient pretraining on a developmentally plausible corpus" (6 of 6 papers shown)
Pretraining Language Models for Diachronic Linguistic Change Discovery
Elisabeth Fittschen, Sabrina Li, Tom Lippincott, Leshem Choshen, Craig Messner
07 Apr 2025

BERTtime Stories: Investigating the Role of Synthetic Story Data in Language Pre-training
Nikitas Theodoropoulos, Giorgos Filandrianos, Vassilis Lyberatos, Maria Lymperaiou, Giorgos Stamou
24 Feb 2025

BabyLM Turns 3: Call for papers for the 2025 BabyLM workshop
Lucas Charpentier, Leshem Choshen, Ryan Cotterell, Mustafa Omer Gul, Michael Y. Hu, ..., Candace Ross, Raj Sanjay Shah, Alex Warstadt, Ethan Gotlieb Wilcox, Adina Williams
15 Feb 2025

A Distributional Perspective on Word Learning in Neural Language Models
Filippo Ficarra, Ryan Cotterell, Alex Warstadt
09 Feb 2025

GPT or BERT: why not both?
Lucas Georges Gabriel Charpentier, David Samuel
31 Dec 2024

BabyLM Challenge: Exploring the Effect of Variation Sets on Language Model Training Efficiency
Akari Haga, Akiyo Fukatsu, Miyu Oba, Arianna Bisazza, Yohei Oseki
14 Nov 2024