arXiv: 2210.04428
A Simple Baseline that Questions the Use of Pretrained-Models in Continual Learning
10 October 2022
Paul Janson, Wenxuan Zhang, Rahaf Aljundi, Mohamed Elhoseiny
Topics: VLM, SSL, CLL
Papers citing "A Simple Baseline that Questions the Use of Pretrained-Models in Continual Learning" (8 of 8 shown):
FeNeC: Enhancing Continual Learning via Feature Clustering with Neighbor- or Logit-Based Classification
Kamil Książek, Hubert Jastrzębski, Bartosz Trojan, Krzysztof Pniaczek, Michał Karp, Jacek Tabor
CLL · 76 · 0 · 0 · 18 Mar 2025

Capture Global Feature Statistics for One-Shot Federated Learning
Zenghao Guan, Yucan Zhou, Xiaoyan Gu
FedML · 63 · 0 · 0 · 10 Mar 2025

Future-Proofing Class-Incremental Learning
Quentin Jodelet, Xin Liu, Yin Jun Phua, Tsuyoshi Murata
VLM · 36 · 2 · 0 · 04 Apr 2024

Read Between the Layers: Leveraging Multi-Layer Representations for Rehearsal-Free Continual Learning with Pre-Trained Models
Kyra Ahrens, Hans Hergen Lehmann, Jae Hee Lee, Stefan Wermter
CLL · 35 · 7 · 0 · 13 Dec 2023

Rethinking Class-incremental Learning in the Era of Large Pre-trained Models via Test-Time Adaptation
Imad Eddine Marouf, Subhankar Roy, Enzo Tartaglione, Stéphane Lathuilière
CLL · 16 · 3 · 0 · 17 Oct 2023

ImageNet-21K Pretraining for the Masses
T. Ridnik, Emanuel Ben-Baruch, Asaf Noy, Lihi Zelnik-Manor
SSeg, VLM, CLIP · 179 · 686 · 0 · 22 Apr 2021

Adversarial Continual Learning
Sayna Ebrahimi, Franziska Meier, Roberto Calandra, Trevor Darrell, Marcus Rohrbach
CLL, VLM · 152 · 198 · 0 · 21 Mar 2020

ImageNet Large Scale Visual Recognition Challenge
Olga Russakovsky, Jia Deng, Hao Su, J. Krause, S. Satheesh, ..., A. Karpathy, A. Khosla, Michael S. Bernstein, Alexander C. Berg, Li Fei-Fei
VLM, ObjD · 296 · 39,194 · 0 · 01 Sep 2014