arXiv: 2407.05229
HiDe-PET: Continual Learning via Hierarchical Decomposition of Parameter-Efficient Tuning
7 July 2024
Liyuan Wang, Jingyi Xie, Xingxing Zhang, Hang Su, Jun Zhu
Topic: CLL
Papers citing "HiDe-PET: Continual Learning via Hierarchical Decomposition of Parameter-Efficient Tuning" (5 / 5 papers shown)
1. SLCA++: Unleash the Power of Sequential Fine-tuning for Continual Learning with Pre-training
   Gengwei Zhang, Liyuan Wang, Guoliang Kang, Ling Chen, Yunchao Wei
   Topics: VLM, CLL · 31 · 2 · 0 · 15 Aug 2024

2. First Session Adaptation: A Strong Replay-Free Baseline for Class-Incremental Learning
   A. Panos, Yuriko Kobe, Daniel Olmeda Reino, Rahaf Aljundi, Richard E. Turner
   Topics: CLL · 93 · 42 · 0 · 23 Mar 2023

3. Generalized Out-of-Distribution Detection: A Survey
   Jingkang Yang, Kaiyang Zhou, Yixuan Li, Ziwei Liu
   171 · 870 · 0 · 21 Oct 2021

4. ImageNet-21K Pretraining for the Masses
   T. Ridnik, Emanuel Ben-Baruch, Asaf Noy, Lihi Zelnik-Manor
   Topics: SSeg, VLM, CLIP · 154 · 676 · 0 · 22 Apr 2021

5. The Power of Scale for Parameter-Efficient Prompt Tuning
   Brian Lester, Rami Al-Rfou, Noah Constant
   Topics: VPVLM · 278 · 3,784 · 0 · 18 Apr 2021