arXiv: 2211.01200
Multi-level Distillation of Semantic Knowledge for Pre-training Multilingual Language Model
2 November 2022
Mingqi Li, Fei Ding, Dan Zhang, Long Cheng, Hongxin Hu, Feng Luo

Papers citing "Multi-level Distillation of Semantic Knowledge for Pre-training Multilingual Language Model" (6 papers)

The Zeno's Paradox of 'Low-Resource' Languages
H. Nigatu, A. Tonja, Benjamin Rosman, Thamar Solorio, Monojit Choudhury
28 Oct 2024

Teaching LLMs to Abstain across Languages via Multilingual Feedback
Shangbin Feng, Weijia Shi, Yike Wang, Wenxuan Ding, Orevaoghene Ahia, Shuyue Stella Li, Vidhisha Balachandran, Sunayana Sitaram, Yulia Tsvetkov
22 Jun 2024

DE³-BERT: Distance-Enhanced Early Exiting for BERT based on Prototypical Networks
Jianing He, Qi Zhang, Weiping Ding, Duoqian Miao, Jun Zhao, Liang Hu, LongBing Cao
03 Feb 2024

TaCL: Improving BERT Pre-training with Token-aware Contrastive Learning
Yixuan Su, Fangyu Liu, Zaiqiao Meng, Tian Lan, Lei Shu, Ehsan Shareghi, Nigel Collier
07 Nov 2021

ERNIE-M: Enhanced Multilingual Representation by Aligning Cross-lingual Semantics with Monolingual Corpora
Ouyang Xuan, Shuohuan Wang, Chao Pang, Yu Sun, Hao Tian, Hua-Hong Wu, Haifeng Wang
31 Dec 2020

Multilingual BERT Post-Pretraining Alignment
Lin Pan, Chung-Wei Hang, Haode Qi, Abhishek Shah, Saloni Potdar, Mo Yu
23 Oct 2020