ResearchTrend.AI

arXiv: 2404.00790
Rehearsal-Free Modular and Compositional Continual Learning for Language Models

31 March 2024
Authors: Mingyang Wang, Heike Adel, Lukas Lange, Jannik Strötgen, Hinrich Schütze
Topics: KELM, CLL

Papers citing "Rehearsal-Free Modular and Compositional Continual Learning for Language Models" (4 papers)
Achieving Upper Bound Accuracy of Joint Training in Continual Learning
Saleh Momeni, Bing Liu · 17 Feb 2025 · CLL
Orthogonal Subspace Learning for Language Model Continual Learning
Xiao Wang, Tianze Chen, Qiming Ge, Han Xia, Rong Bao, Rui Zheng, Qi Zhang, Tao Gui, Xuanjing Huang · 22 Oct 2023 · CLL
NLNDE at SemEval-2023 Task 12: Adaptive Pretraining and Source Language Selection for Low-Resource Multilingual Sentiment Analysis
Mingyang Wang, Heike Adel, Lukas Lange, Jannik Strötgen, Hinrich Schütze · 28 Apr 2023
LFPT5: A Unified Framework for Lifelong Few-shot Language Learning Based on Prompt Tuning of T5
Chengwei Qin, Shafiq R. Joty · 14 Oct 2021 · CLL