Multi-level Distillation of Semantic Knowledge for Pre-training Multilingual Language Model

2 November 2022
Mingqi Li
Fei Ding
Dan Zhang
Long Cheng
Hongxin Hu
Feng Luo

Papers citing "Multi-level Distillation of Semantic Knowledge for Pre-training Multilingual Language Model"

6 / 6 papers shown
The Zeno's Paradox of `Low-Resource' Languages
H. Nigatu
A. Tonja
Benjamin Rosman
Thamar Solorio
Monojit Choudhury
28 Oct 2024

Teaching LLMs to Abstain across Languages via Multilingual Feedback
Shangbin Feng
Weijia Shi
Yike Wang
Wenxuan Ding
Orevaoghene Ahia
Shuyue Stella Li
Vidhisha Balachandran
Sunayana Sitaram
Yulia Tsvetkov
22 Jun 2024

DE$^3$-BERT: Distance-Enhanced Early Exiting for BERT based on Prototypical Networks
Jianing He
Qi Zhang
Weiping Ding
Duoqian Miao
Jun Zhao
Liang Hu
LongBing Cao
03 Feb 2024

TaCL: Improving BERT Pre-training with Token-aware Contrastive Learning
Yixuan Su
Fangyu Liu
Zaiqiao Meng
Tian Lan
Lei Shu
Ehsan Shareghi
Nigel Collier
07 Nov 2021

ERNIE-M: Enhanced Multilingual Representation by Aligning Cross-lingual Semantics with Monolingual Corpora
Ouyang Xuan
Shuohuan Wang
Chao Pang
Yu Sun
Hao Tian
Hua-Hong Wu
Haifeng Wang
31 Dec 2020

Multilingual BERT Post-Pretraining Alignment
Lin Pan
Chung-Wei Hang
Haode Qi
Abhishek Shah
Saloni Potdar
Mo Yu
23 Oct 2020