nmT5 -- Is parallel data still relevant for pre-training massively multilingual language models?
arXiv:2106.02171 · 3 June 2021
Mihir Kale, Aditya Siddhant, Noah Constant, Melvin Johnson, Rami Al-Rfou, Linting Xue
Papers citing "nmT5 -- Is parallel data still relevant for pre-training massively multilingual language models?" (5 of 5 shown)
1. The Interpreter Understands Your Meaning: End-to-end Spoken Language Understanding Aided by Speech Translation
   Mutian He, Philip N. Garner (16 May 2023)

2. On the Role of Parallel Data in Cross-lingual Transfer Learning
   Machel Reid, Mikel Artetxe (20 Dec 2022)

3. Advancing Multilingual Pre-training: TRIP Triangular Document-level Pre-training for Multilingual Language Models
   Hongyuan Lu, Haoyang Huang, Shuming Ma, Dongdong Zhang, W. Lam, Furu Wei (15 Dec 2022)

4. PARADISE: Exploiting Parallel Data for Multilingual Sequence-to-Sequence Pretraining
   Machel Reid, Mikel Artetxe (04 Aug 2021)

5. ERNIE-M: Enhanced Multilingual Representation by Aligning Cross-lingual Semantics with Monolingual Corpora
   Xuan Ouyang, Shuohuan Wang, Chao Pang, Yu Sun, Hao Tian, Hua Wu, Haifeng Wang (31 Dec 2020)