Reflecting on the State of Rehearsal-free Continual Learning with Pretrained Models
arXiv 2406.09384 · 13 June 2024
Lukas Thede, Karsten Roth, Olivier J. Hénaff, Matthias Bethge, Zeynep Akata
Papers citing "Reflecting on the State of Rehearsal-free Continual Learning with Pretrained Models" (6 / 6 papers shown):
Understanding the Limits of Lifelong Knowledge Editing in LLMs (07 Mar 2025)
Lukas Thede, Karsten Roth, Matthias Bethge, Zeynep Akata, Tom Hartvigsen
How to Merge Your Multimodal Models Over Time? (09 Dec 2024)
Sebastian Dziadzio, Vishaal Udandarao, Karsten Roth, Ameya Prabhu, Zeynep Akata, Samuel Albanie, Matthias Bethge
How Green Is Continual Learning, Really? Analyzing the Energy Consumption in Continual Training of Vision Foundation Models (27 Sep 2024)
Tomaso Trinci, Simone Magistri, Roberto Verdecchia, Andrew D. Bagdanov
First Session Adaptation: A Strong Replay-Free Baseline for Class-Incremental Learning (23 Mar 2023)
A. Panos, Yuriko Kobe, Daniel Olmeda Reino, Rahaf Aljundi, Richard E. Turner
AdaptFormer: Adapting Vision Transformers for Scalable Visual Recognition (26 May 2022)
Shoufa Chen, Chongjian Ge, Zhan Tong, Jiangliu Wang, Yibing Song, Jue Wang, Ping Luo
The Power of Scale for Parameter-Efficient Prompt Tuning (18 Apr 2021)
Brian Lester, Rami Al-Rfou, Noah Constant