English Intermediate-Task Training Improves Zero-Shot Cross-Lingual Transfer Too
arXiv:2005.13013 · 26 May 2020
Jason Phang, Iacer Calixto, Phu Mon Htut, Yada Pruksachatkun, Haokun Liu, Clara Vania, Katharina Kann, Samuel R. Bowman
Tags: LRM
Papers citing "English Intermediate-Task Training Improves Zero-Shot Cross-Lingual Transfer Too" (10/10 papers shown)
- When to Use Multi-Task Learning vs Intermediate Fine-Tuning for Pre-Trained Encoder Transfer Learning
  Orion Weller, Kevin Seppi, Matt Gardner · 17 May 2022

- Improving In-Context Few-Shot Learning via Self-Supervised Training
  Mingda Chen, Jingfei Du, Ramakanth Pasunuru, Todor Mihaylov, Srini Iyer, Ves Stoyanov, Zornitsa Kozareva · 03 May 2022 · Tags: SSL, AI4MH

- Cross-Lingual Ability of Multilingual Masked Language Models: A Study of Language Structure
  Yuan Chai, Yaobo Liang, Nan Duan · 16 Mar 2022 · Tags: LRM

- On the Language-specificity of Multilingual BERT and the Impact of Fine-tuning
  Marc Tanti, Lonneke van der Plas, Claudia Borg, Albert Gatt · 14 Sep 2021

- Cross-lingual Transfer for Text Classification with Dictionary-based Heterogeneous Graph
  Nuttapong Chairatanakul, Noppayut Sriwatanasakdi, Nontawat Charoenphakdee, Xin Liu, T. Murata · 09 Sep 2021

- Rethinking Why Intermediate-Task Fine-Tuning Works
  Ting-Yun Chang, Chi-Jen Lu · 26 Aug 2021 · Tags: LRM

- CANINE: Pre-training an Efficient Tokenization-Free Encoder for Language Representation
  J. Clark, Dan Garrette, Iulia Turc, John Wieting · 11 Mar 2021

- Rethinking embedding coupling in pre-trained language models
  Hyung Won Chung, Thibault Févry, Henry Tsai, Melvin Johnson, Sebastian Ruder · 24 Oct 2020

- FILTER: An Enhanced Fusion Method for Cross-lingual Language Understanding
  Yuwei Fang, Shuohang Wang, Zhe Gan, S. Sun, Jingjing Liu · 10 Sep 2020 · Tags: VLM

- MLQA: Evaluating Cross-lingual Extractive Question Answering
  Patrick Lewis, Barlas Oğuz, Ruty Rinott, Sebastian Riedel, Holger Schwenk · 16 Oct 2019 · Tags: ELM