
Exploring Fine-tuning Techniques for Pre-trained Cross-lingual Models via Continual Learning

29 April 2020
Zihan Liu, Genta Indra Winata, Andrea Madotto, Pascale Fung
arXiv: 2004.14218

Papers citing "Exploring Fine-tuning Techniques for Pre-trained Cross-lingual Models via Continual Learning"

11 papers
HOP to the Next Tasks and Domains for Continual Learning in NLP
Umberto Michieli, Mete Ozay (28 Feb 2024)
Overcoming Catastrophic Forgetting in Massively Multilingual Continual Learning
Annual Meeting of the Association for Computational Linguistics (ACL), 2023
Genta Indra Winata, Lingjue Xie, Karthik Radhakrishnan, Shijie Wu, Xisen Jin, Pengxiang Cheng, Mayank Kulkarni, Daniel Preoţiuc-Pietro (25 May 2023)
Memorization of Named Entities in Fine-tuned BERT Models
International Cross-Domain Conference on Machine Learning and Knowledge Extraction (CD-MAKE), 2022
Andor Diera, N. Lell, Aygul Garifullina, A. Scherp (07 Dec 2022)
Continual Training of Language Models for Few-Shot Learning
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2022
Zixuan Ke, Haowei Lin, Yijia Shao, Hu Xu, Lei Shu, Yinan Han (11 Oct 2022)
Memory Efficient Continual Learning with Transformers
Neural Information Processing Systems (NeurIPS), 2022
Beyza Ermis, Giovanni Zappella, Martin Wistuba, Aditya Rawal, Cédric Archambeau (09 Mar 2022)
Achieving Forgetting Prevention and Knowledge Transfer in Continual Learning
Neural Information Processing Systems (NeurIPS), 2021
Zixuan Ke, Bing-Quan Liu, Nianzu Ma, Hu Xu, Lei Shu (05 Dec 2021)
Soft Layer Selection with Meta-Learning for Zero-Shot Cross-Lingual Transfer
Weijia Xu, Batool Haider, Jason Krone, Saab Mansour (21 Jul 2021)
A Primer on Pretrained Multilingual Language Models
Sumanth Doddapaneni, Gowtham Ramesh, Mitesh M. Khapra, Anoop Kunchukuttan, Pratyush Kumar (01 Jul 2021)
Towards Zero-Shot Multilingual Synthetic Question and Answer Generation for Cross-Lingual Reading Comprehension
Siamak Shakeri, Noah Constant, Mihir Kale, Linting Xue (22 Oct 2020)
mT5: A massively multilingual pre-trained text-to-text transformer
Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, Colin Raffel (22 Oct 2020)
Cross-lingual Spoken Language Understanding with Regularized Representation Alignment
Zihan Liu, Genta Indra Winata, Peng Xu, Mohammad Kachuee, Pascale Fung (30 Sep 2020)