MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer
Jonas Pfeiffer, Ivan Vulić, Iryna Gurevych, Sebastian Ruder
arXiv:2005.00052 · 30 April 2020

Papers citing "MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer"

44 / 94 papers shown
When does Parameter-Efficient Transfer Learning Work for Machine Translation?
A. Ustun, Asa Cooper Stickland · 23 May 2022

Multi2WOZ: A Robust Multilingual Dataset and Conversational Pretraining for Task-Oriented Dialog
Chia-Chien Hung, Anne Lauscher, Ivan Vulić, Simone Paolo Ponzetto, Goran Glavaš · 20 May 2022

Lifting the Curse of Multilinguality by Pre-training Modular Transformers
Jonas Pfeiffer, Naman Goyal, Xi Victoria Lin, Xian Li, James Cross, Sebastian Riedel, Mikel Artetxe · LRM · 12 May 2022

Parameter-Efficient Tuning by Manipulating Hidden States of Pretrained Language Models For Classification Tasks
Haoran Yang, Piji Li, Wai Lam · 10 Apr 2022

IDPG: An Instance-Dependent Prompt Generation Method
Zhuofeng Wu, Sinong Wang, Jiatao Gu, Rui Hou, Yuxiao Dong, V. Vydiswaran, Hao Ma · VLM · 09 Apr 2022

Parameter-Efficient Abstractive Question Answering over Tables or Text
Vaishali Pal, Evangelos Kanoulas, Maarten de Rijke · LMTD · 07 Apr 2022

Parameter-Efficient Neural Reranking for Cross-Lingual and Multilingual Retrieval
Robert Litschko, Ivan Vulić, Goran Glavaš · LRM · 05 Apr 2022

A Dual-Contrastive Framework for Low-Resource Cross-Lingual Named Entity Recognition
Yingwen Fu, Nankai Lin, Ziyu Yang, Shengyi Jiang · 02 Apr 2022

Speaker adaptation for Wav2vec2 based dysarthric ASR
M. Baskar, Tim Herzig, Diana Nguyen, Mireia Díez, Tim Polzehl, L. Burget, J. Černocký · 02 Apr 2022

Match the Script, Adapt if Multilingual: Analyzing the Effect of Multilingual Pretraining on Cross-lingual Transferability
Yoshinari Fujinuma, Jordan L. Boyd-Graber, Katharina Kann · AAML · 21 Mar 2022

Geographic Adaptation of Pretrained Language Models
Valentin Hofmann, Goran Glavaš, Nikola Ljubešić, J. Pierrehumbert, Hinrich Schütze · VLM · 16 Mar 2022

Hyperdecoders: Instance-specific decoders for multi-task NLP
Hamish Ivison, Matthew E. Peters · AI4CE · 15 Mar 2022

Delta Tuning: A Comprehensive Study of Parameter Efficient Methods for Pre-trained Language Models
Ning Ding, Yujia Qin, Guang Yang, Fu Wei, Zonghan Yang, ..., Jianfei Chen, Yang Liu, Jie Tang, Juan Li, Maosong Sun · 14 Mar 2022

Memory Efficient Continual Learning with Transformers
B. Ermiş, Giovanni Zappella, Martin Wistuba, Aditya Rawal, Cédric Archambeau · CLL · 09 Mar 2022

Efficient Adapter Transfer of Self-Supervised Speech Models for Automatic Speech Recognition
Bethan Thomas, Samuel Kessler, S. Karout · 07 Feb 2022

NaijaSenti: A Nigerian Twitter Sentiment Corpus for Multilingual Sentiment Analysis
Shamsuddeen Hassan Muhammad, David Ifeoluwa Adelani, Sebastian Ruder, I. Ahmad, Idris Abdulmumin, ..., Chris C. Emezue, Saheed Abdul, Anuoluwapo Aremu, Alipio Jeorge, P. Brazdil · 20 Jan 2022

Cascading Adaptors to Leverage English Data to Improve Performance of Question Answering for Low-Resource Languages
Hariom A. Pandya, Bhavik Ardeshna, Brijesh S. Bhatt · 18 Dec 2021

Efficient Hierarchical Domain Adaptation for Pretrained Language Models
Alexandra Chronopoulou, Matthew E. Peters, Jesse Dodge · 16 Dec 2021

Recent Advances in Natural Language Processing via Large Pre-Trained Language Models: A Survey
Bonan Min, Hayley L Ross, Elior Sulem, Amir Pouran Ben Veyseh, Thien Huu Nguyen, Oscar Sainz, Eneko Agirre, Ilana Heinz, Dan Roth · LM&MA, VLM, AI4CE · 01 Nov 2021

Can Character-based Language Models Improve Downstream Task Performance in Low-Resource and Noisy Language Scenarios?
Arij Riabi, Benoît Sagot, Djamé Seddah · 26 Oct 2021

Few-shot Controllable Style Transfer for Low-Resource Multilingual Settings
Kalpesh Krishna, Deepak Nathani, Xavier Garcia, Bidisha Samanta, Partha P. Talukdar · 14 Oct 2021

xGQA: Cross-Lingual Visual Question Answering
Jonas Pfeiffer, Gregor Geigle, Aishwarya Kamath, Jan-Martin O. Steitz, Stefan Roth, Ivan Vulić, Iryna Gurevych · 13 Sep 2021

Efficient Test Time Adapter Ensembling for Low-resource Language Varieties
Xinyi Wang, Yulia Tsvetkov, Sebastian Ruder, Graham Neubig · 10 Sep 2021

MetaXT: Meta Cross-Task Transfer between Disparate Label Spaces
Srinagesh Sharma, Guoqing Zheng, Ahmed Hassan Awadallah · 09 Sep 2021

Sustainable Modular Debiasing of Language Models
Anne Lauscher, Tobias Lüken, Goran Glavaš · 08 Sep 2021

Nearest Neighbour Few-Shot Learning for Cross-lingual Classification
M Saiful Bari, Batool Haider, Saab Mansour · VLM · 06 Sep 2021

Design and Scaffolded Training of an Efficient DNN Operator for Computer Vision on the Edge
Vinod Ganesan, Pratyush Kumar · 25 Aug 2021

A Primer on Pretrained Multilingual Language Models
Sumanth Doddapaneni, Gowtham Ramesh, Mitesh M. Khapra, Anoop Kunchukuttan, Pratyush Kumar · LRM · 01 Jul 2021

Specializing Multilingual Language Models: An Empirical Study
Ethan C. Chau, Noah A. Smith · 16 Jun 2021

Reinforced Iterative Knowledge Distillation for Cross-Lingual Named Entity Recognition
Shining Liang, Ming Gong, J. Pei, Linjun Shou, Wanli Zuo, Xianglin Zuo, Daxin Jiang · 01 Jun 2021

Improving the Lexical Ability of Pretrained Language Models for Unsupervised Neural Machine Translation
Alexandra Chronopoulou, Dario Stojanovski, Alexander M. Fraser · SSL · 18 Mar 2021

Structural Adapters in Pretrained Language Models for AMR-to-text Generation
Leonardo F. R. Ribeiro, Yue Zhang, Iryna Gurevych · 16 Mar 2021

Multilingual Multimodal Pre-training for Zero-Shot Cross-Lingual Transfer of Vision-Language Models
Po-Yao (Bernie) Huang, Mandela Patrick, Junjie Hu, Graham Neubig, Florian Metze, Alexander G. Hauptmann · MLLM, VLM · 16 Mar 2021

CANINE: Pre-training an Efficient Tokenization-Free Encoder for Language Representation
J. Clark, Dan Garrette, Iulia Turc, John Wieting · 11 Mar 2021

PADA: Example-based Prompt Learning for on-the-fly Adaptation to Unseen Domains
Eyal Ben-David, Nadav Oved, Roi Reichart · VLM, OOD · 24 Feb 2021

Trankit: A Light-Weight Transformer-based Toolkit for Multilingual Natural Language Processing
Minh Nguyen, Viet Dac Lai, Amir Pouran Ben Veyseh, Thien Huu Nguyen · 09 Jan 2021

Learning to Generate Task-Specific Adapters from Task Description
Qinyuan Ye, Xiang Ren · 02 Jan 2021

UNKs Everywhere: Adapting Multilingual Language Models to New Scripts
Jonas Pfeiffer, Ivan Vulić, Iryna Gurevych, Sebastian Ruder · 31 Dec 2020

Emergent Communication Pretraining for Few-Shot Machine Translation
Yaoyiran Li, E. Ponti, Ivan Vulić, Anna Korhonen · 02 Nov 2020

Rethinking embedding coupling in pre-trained language models
Hyung Won Chung, Thibault Févry, Henry Tsai, Melvin Johnson, Sebastian Ruder · 24 Oct 2020

AdapterDrop: On the Efficiency of Adapters in Transformers
Andreas Rücklé, Gregor Geigle, Max Glockner, Tilman Beck, Jonas Pfeiffer, Nils Reimers, Iryna Gurevych · 22 Oct 2020

Efficient Transformers: A Survey
Yi Tay, Mostafa Dehghani, Dara Bahri, Donald Metzler · VLM · 14 Sep 2020

UDapter: Language Adaptation for Truly Universal Dependency Parsing
A. Ustun, Arianna Bisazza, G. Bouma, Gertjan van Noord · 29 Apr 2020

What the [MASK]? Making Sense of Language-Specific BERT Models
Debora Nozza, Federico Bianchi, Dirk Hovy · 05 Mar 2020