Load What You Need: Smaller Versions of Multilingual BERT
Amine Abdaoui, Camille Pradel, Grégoire Sigel
12 October 2020 · arXiv:2010.05609
Papers citing "Load What You Need: Smaller Versions of Multilingual BERT" (8 papers)
The Ups and Downs of Large Language Model Inference with Vocabulary Trimming by Language Heuristics
Nikolay Bogoychev, Pinzhen Chen, Barry Haddow, Alexandra Birch (16 Nov 2023)
An Efficient Multilingual Language Model Compression through Vocabulary Trimming
Asahi Ushio, Yi Zhou, Jose Camacho-Collados (24 May 2023)
idT5: Indonesian Version of Multilingual T5 Transformer
Mukhlish Fuadi, A. Wibawa, S. Sumpeno (02 Feb 2023)
Benchmarking Transformers-based models on French Spoken Language Understanding tasks
Oralie Cattan, Sahar Ghannay, Christophe Servan, Sophie Rosset (19 Jul 2022)
You Are What You Write: Preserving Privacy in the Era of Large Language Models
Richard Plant, V. Giuffrida, Dimitra Gkatzia (20 Apr 2022)
TextPruner: A Model Pruning Toolkit for Pre-Trained Language Models
Ziqing Yang, Yiming Cui, Zhigang Chen (30 Mar 2022)
DziriBERT: a Pre-trained Language Model for the Algerian Dialect
Amine Abdaoui, Mohamed Berrimi, Mourad Oussalah, A. Moussaoui (25 Sep 2021)
Q-BERT: Hessian Based Ultra Low Precision Quantization of BERT
Sheng Shen, Zhen Dong, Jiayu Ye, Linjian Ma, Z. Yao, A. Gholami, Michael W. Mahoney, Kurt Keutzer (12 Sep 2019)