Investigating Catastrophic Forgetting During Continual Training for Neural Machine Translation

International Conference on Computational Linguistics (COLING), 2020
2 November 2020
Shuhao Gu, Yang Feng
CLL
arXiv: 2011.00678

Papers citing "Investigating Catastrophic Forgetting During Continual Training for Neural Machine Translation" (15 papers)
Conditions for Catastrophic Forgetting in Multilingual Translation
Danni Liu, Jan Niehues
CLL · 209 · 3 · 0 · 22 Oct 2025

Hierarchical-Task-Aware Multi-modal Mixture of Incremental LoRA Experts for Embodied Continual Learning
Annual Meeting of the Association for Computational Linguistics (ACL), 2025
Ziqi Jia, Anmin Wang, Xiaoyang Qu, Xiaowen Yang, Jianzong Wang
CLL · 297 · 4 · 0 · 05 Jun 2025

THOR-MoE: Hierarchical Task-Guided and Context-Responsive Routing for Neural Machine Translation
Annual Meeting of the Association for Computational Linguistics (ACL), 2025
Yunlong Liang, Fandong Meng, Jie Zhou
MoE · 250 · 0 · 0 · 20 May 2025

Neural Networks Remember More: The Power of Parameter Isolation and Combination
Biqing Zeng, Zehan Li, Aladdin Ayesh
CLL, KELM · 275 · 0 · 0 · 16 Feb 2025

Domain adapted machine translation: What does catastrophic forgetting forget and why?
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2024
Danielle Saunders, Steve DeNeefe
AI4CE · 161 · 6 · 0 · 23 Dec 2024

Epi-Curriculum: Episodic Curriculum Learning for Low-Resource Domain Adaptation in Neural Machine Translation
IEEE Transactions on Artificial Intelligence (IEEE TAI), 2023
Keyu Chen, Zhuang Di, Mingchen Li, J. M. Chang
340 · 6 · 0 · 06 Sep 2023

Adversarial Fine-Tuning of Language Models: An Iterative Optimisation Approach for the Generation and Detection of Problematic Content
Charles O'Neill, Jack Miller, I. Ciucă, Y. Ting 丁, Thang Bui
254 · 10 · 0 · 26 Aug 2023

The Effect of Masking Strategies on Knowledge Retention by Language Models
Jonas Wallat, Tianyi Zhang, Avishek Anand
KELM, CLL · 151 · 0 · 0 · 12 Jun 2023

Efficiently Upgrading Multilingual Machine Translation Models to Support More Languages
Conference of the European Chapter of the Association for Computational Linguistics (EACL), 2023
Simeng Sun, Maha Elbayad, Anna Y. Sun, James Cross
CLL, LRM · 303 · 5 · 0 · 07 Feb 2023

A Comprehensive Survey of Continual Learning: Theory, Method and Application
IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2023
Liyuan Wang, Xingxing Zhang, Hang Su, Jun Zhu
KELM, CLL · 954 · 1,280 · 0 · 31 Jan 2023

Continual Learning of Neural Machine Translation within Low Forgetting Risk Regions
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2022
Shuhao Gu, Bojie Hu, Yang Feng
CLL · 321 · 24 · 0 · 03 Nov 2022

Overcoming Catastrophic Forgetting beyond Continual Learning: Balanced Training for Neural Machine Translation
Annual Meeting of the Association for Computational Linguistics (ACL), 2022
Chenze Shao, Yang Feng
CLL · 238 · 40 · 0 · 08 Mar 2022

Cross-Attention is All You Need: Adapting Pretrained Transformers for Machine Translation
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021
Mozhdeh Gheini, Xiang Ren, Jonathan May
LRM · 403 · 176 · 0 · 18 Apr 2021

Domain Adaptation and Multi-Domain Adaptation for Neural Machine Translation: A Survey
Journal of Artificial Intelligence Research (JAIR), 2021
Danielle Saunders
AI4CE · 439 · 112 · 0 · 14 Apr 2021

Pruning-then-Expanding Model for Domain Adaptation of Neural Machine Translation
North American Chapter of the Association for Computational Linguistics (NAACL), 2021
Shuhao Gu, Yang Feng, Wanying Xie
CLL, AI4CE · 244 · 33 · 0 · 25 Mar 2021