ResearchTrend.AI

© 2026 ResearchTrend.AI, All rights reserved.

Orthogonal Language and Task Adapters in Zero-Shot Cross-Lingual Transfer
arXiv:2012.06460, 11 December 2020
M. Vidoni, Ivan Vulić, Goran Glavaš

Papers citing "Orthogonal Language and Task Adapters in Zero-Shot Cross-Lingual Transfer"

25 citing papers are listed below.
How to Tune a Multilingual Encoder Model for Germanic Languages: A Study of PEFT, Full Fine-Tuning, and Language Adapters
Romina Oji, Jenny Kunz
10 Jan 2025
Language and Task Arithmetic with Parameter-Efficient Layers for Zero-Shot Summarization
Alexandra Chronopoulou, Jonas Pfeiffer, Joshua Maynez, Xinyi Wang, Sebastian Ruder, Priyanka Agrawal
15 Nov 2023
Efficient Domain Adaptation of Sentence Embeddings Using Adapters
Recent Advances in Natural Language Processing (RANLP), 2023
Tim Schopf, Dennis Schneider, Florian Matthes
06 Jul 2023
Cross-Lingual Transfer with Target Language-Ready Task Adapters
Annual Meeting of the Association for Computational Linguistics (ACL), 2023
Marinela Parović, Alan Ansell, Ivan Vulić, Anna Korhonen
05 Jun 2023
Distilling Efficient Language-Specific Models for Cross-Lingual Transfer
Annual Meeting of the Association for Computational Linguistics (ACL), 2023
Alan Ansell, Edoardo Ponti, Anna Korhonen, Ivan Vulić
02 Jun 2023
mmT5: Modular Multilingual Pre-Training Solves Source Language Hallucinations
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2023
Jonas Pfeiffer, Francesco Piccinno, Massimo Nicosia, Xinyi Wang, Machel Reid, Sebastian Ruder
23 May 2023
Analyzing and Reducing the Performance Gap in Cross-Lingual Transfer with Fine-tuning Slow and Fast
Annual Meeting of the Association for Computational Linguistics (ACL), 2023
Yiduo Guo, Yaobo Liang, Dongyan Zhao, Yinan Han, Du Nan
19 May 2023
Fine-Tuning BERT with Character-Level Noise for Zero-Shot Transfer to Dialects and Closely-Related Languages
Workshop on NLP for Similar Languages, Varieties and Dialects (VarDial), 2023
Aarohi Srivastava, David Chiang
30 Mar 2023
Legal and Political Stance Detection of SCOTUS Language
Noah Bergam, Emily Allaway, Kathleen McKeown
21 Nov 2022
Inducer-tuning: Connecting Prefix-tuning and Adapter-tuning
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2022
Yifan Chen, Devamanyu Hazarika, Mahdi Namazifar, Yang Liu, Di Jin, Dilek Z. Hakkani-Tür
26 Oct 2022
Feature Aggregation in Zero-Shot Cross-Lingual Transfer Using Multilingual BERT
International Conference on Pattern Recognition (ICPR), 2022
Beiduo Chen, Wu Guo, Quan Liu, Kun Tao
17 May 2022
Lifting the Curse of Multilinguality by Pre-training Modular Transformers
North American Chapter of the Association for Computational Linguistics (NAACL), 2022
Jonas Pfeiffer, Naman Goyal, Xi Lin, Xian Li, James Cross, Sebastian Riedel, Mikel Artetxe
12 May 2022
Cross-Lingual Text Classification with Multilingual Distillation and Zero-Shot-Aware Training
Ziqing Yang, Yiming Cui, Zhigang Chen, Shijin Wang
28 Feb 2022
Unsupervised Domain Adaptation with Adapter
Rongsheng Zhang, Yinhe Zheng, Xiaoxi Mao, Shiyu Huang
01 Nov 2021
Composable Sparse Fine-Tuning for Cross-Lingual Transfer
Alan Ansell, Edoardo Ponti, Anna Korhonen, Ivan Vulić
14 Oct 2021
xGQA: Cross-Lingual Visual Question Answering
Jonas Pfeiffer, Gregor Geigle, Aishwarya Kamath, Jan-Martin O. Steitz, Stefan Roth, Ivan Vulić, Iryna Gurevych
13 Sep 2021
Sustainable Modular Debiasing of Language Models
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021
Anne Lauscher, Tobias Lüken, Goran Glavaš
08 Sep 2021
AdapterHub Playground: Simple and Flexible Few-Shot Learning with Adapters
Tilman Beck, Bela Bohlender, Christina Viehmann, Vincent Hane, Yanik Adamson, Jaber Khuri, Jonas Brossmann, Jonas Pfeiffer, Iryna Gurevych
18 Aug 2021
Target-Oriented Fine-tuning for Zero-Resource Named Entity Recognition
Findings of ACL, 2021
Ying Zhang, Fandong Meng, Jinan Xu, Jie Zhou
22 Jul 2021
A Primer on Pretrained Multilingual Language Models
Sumanth Doddapaneni, Gowtham Ramesh, Mitesh M. Khapra, Anoop Kunchukuttan, Pratyush Kumar
01 Jul 2021
Specializing Multilingual Language Models: An Empirical Study
Ethan C. Chau, Noah A. Smith
16 Jun 2021
What to Pre-Train on? Efficient Intermediate Task Selection
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021
Clifton A. Poth, Jonas Pfeiffer, Andreas Rucklé, Iryna Gurevych
16 Apr 2021
UNKs Everywhere: Adapting Multilingual Language Models to New Scripts
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021
Jonas Pfeiffer, Ivan Vulić, Iryna Gurevych, Sebastian Ruder
31 Dec 2020
AdapterDrop: On the Efficiency of Adapters in Transformers
Andreas Rucklé, Gregor Geigle, Max Glockner, Tilman Beck, Jonas Pfeiffer, Nils Reimers, Iryna Gurevych
22 Oct 2020
AdapterFusion: Non-Destructive Task Composition for Transfer Learning
Conference of the European Chapter of the Association for Computational Linguistics (EACL), 2021
Jonas Pfeiffer, Aishwarya Kamath, Andreas Rucklé, Dong Wang, Iryna Gurevych
01 May 2020