Efficient Continual Learning with Modular Networks and Task-Driven Priors
Tom Véniat, Ludovic Denoyer, Marc'Aurelio Ranzato
International Conference on Learning Representations (ICLR), 2021
arXiv:2012.12631 (v2, latest), 23 December 2020
Topic: CLL

Papers citing "Efficient Continual Learning with Modular Networks and Task-Driven Priors" (50 of 54 papers shown)
- Model Recycling Framework for Multi-Source Data-Free Supervised Transfer Learning. Sijia Wang, Ricardo Henao. 04 Aug 2025.
- GRID: Scalable Task-Agnostic Prompt-Based Continual Learning for Language Models. Anushka Tiwari, Sayantan Pal, Rohini Srihari, Kaiyi Ji. 19 Jul 2025. Tags: CLL, VLM.
- Low-Complexity Inference in Continual Learning via Compressed Knowledge Transfer. Zhenrong Liu, Janne M. J. Huttunen, Mikko Honkala. 13 May 2025. Tags: CLL.
- Parameter-Efficient Continual Fine-Tuning: A Survey. Eric Nuertey Coleman, Luigi Quarantiello, Ziyue Liu, Qinwen Yang, Samrat Mukherjee, J. Hurtado, Vincenzo Lomonaco. 18 Apr 2025. Tags: CLL.
- Studying Cross-cluster Modularity in Neural Networks. Satvik Golechha, Maheep Chaudhary, Joan Velja, Alessandro Abate, Nandi Schoots. 04 Feb 2025.
- Hierarchical Subspaces of Policies for Continual Offline Reinforcement Learning. Anthony Kobanda, Rémy Portelas, Odalric-Ambrym Maillard, Ludovic Denoyer. 19 Dec 2024. Tags: OffRL, CLL.
- RECAST: Reparameterized, Compact weight Adaptation for Sequential Tasks (ICLR, 2024). Nazia Tasnim, Bryan A. Plummer. 25 Nov 2024. Tags: CLL, OffRL.
- Self-Expansion of Pre-trained Models with Mixture of Adapters for Continual Learning. Huiyi Wang, Haodong Lu, Lina Yao, Dong Gong. 27 Mar 2024. Tags: KELM, CLL.
- Towards Redundancy-Free Sub-networks in Continual Learning. Cheng Chen, Jingkuan Song, Lianli Gao, Hengtao Shen. 01 Dec 2023.
- Continual Referring Expression Comprehension via Dual Modular Memorization (IEEE TIP, 2022). Hengtao Shen, Cheng Chen, Peng Wang, Lianli Gao, Ming Wang, Jingkuan Song. 25 Nov 2023. Tags: ObjD.
- Class Gradient Projection For Continual Learning (ACM MM, 2022). Cheng Chen, Ji Zhang, Jingkuan Song, Lianli Gao. 25 Nov 2023. Tags: CLL.
- What Can AutoML Do For Continual Learning? Mert Kilickaya, Joaquin Vanschoren. 20 Nov 2023.
- Visually Grounded Continual Language Learning with Selective Specialization (EMNLP, 2023). Kyra Ahrens, Lennart Bengtson, Jae Hee Lee, Stefan Wermter. 24 Oct 2023.
- Towards Robust and Efficient Continual Language Learning. Adam Fisch, Amal Rannen-Triki, Razvan Pascanu, J. Bornschein, Angeliki Lazaridou, E. Gribovskaya, Marc'Aurelio Ranzato. 11 Jul 2023. Tags: CLL.
- Maintaining Plasticity in Deep Continual Learning. Shibhansh Dohare, J. F. Hernandez-Garcia, Parash Rahman, A. Rupam Mahmood, Richard S. Sutton. 23 Jun 2023. Tags: KELM, CLL.
- Studying Generalization on Memory-Based Methods in Continual Learning. Felipe del-Rio, J. Hurtado, Cristian-Radu Buc, Alvaro Soto, Vincenzo Lomonaco. 16 Jun 2023. Tags: BDL.
- Neural Sculpting: Uncovering hierarchically modular task structure in neural networks through pruning and network analysis (NeurIPS, 2023). S. M. Patil, Loizos Michael, C. Dovrolis. 28 May 2023.
- SketchOGD: Memory-Efficient Continual Learning. Benjamin Wright, Youngjae Min, Jeremy Bernstein, Navid Azizan. 25 May 2023. Tags: CLL.
- Task Difficulty Aware Parameter Allocation & Regularization for Lifelong Learning (CVPR, 2023). Wenjin Wang, Yunqing Hu, Qianglong Chen, Yin Zhang. 11 Apr 2023. Tags: CLL.
- Meta-Album: Multi-domain Meta-Dataset for Few-Shot Image Classification (NeurIPS, 2023). I. Ullah, Dustin Carrión-Ojeda, Sergio Escalera, Isabelle M Guyon, Mike Huisman, F. Mohr, Jan N van Rijn, Haozhe Sun, Joaquin Vanschoren, P. Vu. 16 Feb 2023. Tags: VLM.
- Task-Aware Information Routing from Common Representation Space in Lifelong Learning (ICLR, 2023). Prashant Shivaram Bhat, Bahram Zonooz, Elahe Arani. 14 Feb 2023. Tags: CLL.
- Theory on Forgetting and Generalization of Continual Learning (ICML, 2023). Sen Lin, Peizhong Ju, Yitao Liang, Ness B. Shroff. 12 Feb 2023. Tags: CLL.
- Multipath agents for modular multitask ML systems. Andrea Gesmundo. 06 Feb 2023.
- Continual Learning with Scaled Gradient Projection (AAAI, 2023). Gobinda Saha, Kaushik Roy. 02 Feb 2023. Tags: CLL.
- A Comprehensive Survey of Continual Learning: Theory, Method and Application (IEEE TPAMI, 2023). Liyuan Wang, Xingxing Zhang, Hang Su, Jun Zhu. 31 Jan 2023. Tags: KELM, CLL.
- RMM: Reinforced Memory Management for Class-Incremental Learning (NeurIPS, 2023). Yaoyao Liu, Bernt Schiele, Qianru Sun. 14 Jan 2023. Tags: CLL.
- Dynamically Modular and Sparse General Continual Learning (VISIGRAPP, 2023). Arnav Varma, Elahe Arani, Bahram Zonooz. 02 Jan 2023.
- Building a Subspace of Policies for Scalable Continual Learning (ICLR, 2022). Jean-Baptiste Gaya, T. Doan, Lucas Caccia, Laure Soulier, Ludovic Denoyer, Roberta Raileanu. 18 Nov 2022. Tags: CLL.
- NEVIS'22: A Stream of 100 Tasks Sampled from 30 Years of Computer Vision Research. J. Bornschein, Alexandre Galashov, Ross Hemsley, Amal Rannen-Triki, Yutian Chen, ..., Angeliki Lazaridou, Yee Whye Teh, Andrei A. Rusu, Razvan Pascanu, Marc'Aurelio Ranzato. 15 Nov 2022. Tags: OOD, VLM, AI4TS.
- Exclusive Supermask Subnetwork Training for Continual Learning (ACL, 2022). Prateek Yadav, Joey Tianyi Zhou. 18 Oct 2022. Tags: CLL.
- Toward Sustainable Continual Learning: Detection and Knowledge Repurposing of Similar Tasks. Sijia Wang, Yoojin Choi, Junya Chen, Mostafa El-Khamy, Ricardo Henao. 11 Oct 2022. Tags: CLL.
- Beyond Supervised Continual Learning: a Review. Benedikt Bagus, A. Gepperth, Timothée Lesort. 30 Aug 2022. Tags: BDL, CLL.
- Centroids Matching: an efficient Continual Learning approach operating in the embedding space. Jary Pomponi, Simone Scardapane, A. Uncini. 03 Aug 2022. Tags: FedML, CLL.
- How to Reuse and Compose Knowledge for a Lifetime of Tasks: A Survey on Continual Learning and Functional Composition. Jorge Armando Mendez Mendez, Eric Eaton. 15 Jul 2022. Tags: KELM, CLL.
- CompoSuite: A Compositional Reinforcement Learning Benchmark. Jorge Armando Mendez Mendez, Marcel Hussing, Meghna Gummadi, Eric Eaton. 08 Jul 2022. Tags: CoGe, OffRL.
- SHELS: Exclusive Feature Sets for Novelty Detection and Continual Learning Without Class Boundaries. Meghna Gummadi, Cassandra Kent, Jorge Armando Mendez Mendez, Eric Eaton. 28 Jun 2022. Tags: CLL.
- Remember the Past: Distilling Datasets into Addressable Memories for Neural Networks (NeurIPS, 2022). Zhiwei Deng, Olga Russakovsky. 06 Jun 2022. Tags: FedML, DD.
- Thalamus: a brain-inspired algorithm for biologically-plausible continual learning and disentangled representations (ICLR, 2022). Ali Hummos. 24 May 2022. Tags: CLL.
- Continual Learning with Foundation Models: An Empirical Study of Latent Replay. O. Ostapenko, Timothée Lesort, P. Rodríguez, Md Rifat Arefin, Arthur Douillard, Irina Rish, Laurent Charlin. 30 Apr 2022.
- TRGP: Trust Region Gradient Projection for Continual Learning (ICLR, 2022). Sen Lin, Li Yang, Deliang Fan, Junshan Zhang. 07 Feb 2022. Tags: CLL.
- The CLEAR Benchmark: Continual LEArning on Real-World Imagery. Zhiqiu Lin, Jia Shi, Deepak Pathak, Deva Ramanan. 17 Jan 2022. Tags: CLL, VLM.
- Continual Learning of Long Topic Sequences in Neural Information Retrieval (JIRCE, 2022). Thomas Gerald, Laure Soulier. 10 Jan 2022. Tags: CLL.
- Generative Kernel Continual Learning. Mohammad Mahdi Derakhshani, Xiantong Zhen, Ling Shao, Cees G. M. Snoek. 26 Dec 2021. Tags: BDL, VLM.
- Learning to Prompt for Continual Learning. Zifeng Wang, Zizhao Zhang, Chen-Yu Lee, Han Zhang, Ruoxi Sun, Xiaoqi Ren, Guolong Su, Vincent Perot, Jennifer Dy, Tomas Pfister. 16 Dec 2021. Tags: CLL, VP, VLM, KELM.
- Contrastive Continual Learning with Feature Propagation. Xuejun Han, Yuhong Guo. 03 Dec 2021. Tags: CLL.
- Continual Learning via Local Module Composition (NeurIPS, 2021). O. Ostapenko, Pau Rodríguez López, Massimo Caccia, Laurent Charlin. 15 Nov 2021. Tags: KELM, CLL.
- Wide Neural Networks Forget Less Catastrophically (ICML, 2021). Seyed Iman Mirzadeh, Arslan Chaudhry, Dong Yin, Huiyi Hu, Razvan Pascanu, Dilan Görür, Mehrdad Farajtabar. 21 Oct 2021. Tags: CLL.
- Avoiding Forgetting and Allowing Forward Transfer in Continual Learning via Sparse Networks. Ghada Sokar, Decebal Constantin Mocanu, Mykola Pechenizkiy. 11 Oct 2021. Tags: CLL.
- GROWN: GRow Only When Necessary for Continual Learning. Li Yang, Sen Lin, Junshan Zhang, Deliang Fan. 03 Oct 2021. Tags: CLL.
- DualNet: Continual Learning, Fast and Slow. Quang Pham, Chenghao Liu, Guosheng Lin. 01 Oct 2021. Tags: CLL.