What Happens When Small Is Made Smaller? Exploring the Impact of Compression on Small Data Pretrained Language Models
arXiv: 2404.04759 · 6 April 2024
Busayo Awobade, Mardiyyah Oduwole, Steven Kolawole
Papers citing "What Happens When Small Is Made Smaller? Exploring the Impact of Compression on Small Data Pretrained Language Models" (5 of 5 papers shown)

MasakhaNER 2.0: Africa-centric Transfer Learning for Named Entity Recognition
David Ifeoluwa Adelani, Graham Neubig, Sebastian Ruder, Shruti Rijhwani, Michael Beukman, ..., Idris Abdulmumin, Odunayo Ogundepo, Oreen Yousuf, Tatiana Moteu Ngoli, Dietrich Klakow
22 Oct 2022

The Low-Resource Double Bind: An Empirical Study of Pruning for Low-Resource Machine Translation
Orevaoghene Ahia, Julia Kreutzer, Sara Hooker
06 Oct 2021

Towards Efficient Post-training Quantization of Pre-trained Language Models
Haoli Bai, Lu Hou, Lifeng Shang, Xin Jiang, Irwin King, M. Lyu
30 Sep 2021

The Lottery Ticket Hypothesis for Pre-trained BERT Networks
Tianlong Chen, Jonathan Frankle, Shiyu Chang, Sijia Liu, Yang Zhang, Zhangyang Wang, Michael Carbin
23 Jul 2020

Comparing Rewinding and Fine-tuning in Neural Network Pruning
Alex Renda, Jonathan Frankle, Michael Carbin
05 Mar 2020