Mixture of Experts Meets Prompt-Based Continual Learning
arXiv:2405.14124 · 23 May 2024
Minh Le, An Nguyen, Huy Nguyen, Trang Nguyen, Trang Pham, L. Ngo, Nhat Ho
Tags: CLL
Papers citing "Mixture of Experts Meets Prompt-Based Continual Learning" (7 / 7 papers shown):
- A Comprehensive Survey of Mixture-of-Experts: Algorithms, Theory, and Applications
  Siyuan Mu, Sen Lin · MoE · 10 Mar 2025
- Revisiting Prefix-tuning: Statistical Benefits of Reparameterization among Prompts
  Minh Le, Chau Nguyen, Huy Nguyen, Quyen Tran, Trung Le, Nhat Ho · 03 Oct 2024
- On Least Square Estimation in Softmax Gating Mixture of Experts
  Huy Nguyen, Nhat Ho, Alessandro Rinaldo · 05 Feb 2024
- Emerging Properties in Self-Supervised Vision Transformers
  Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin · 29 Apr 2021
- ImageNet-21K Pretraining for the Masses
  T. Ridnik, Emanuel Ben-Baruch, Asaf Noy, Lihi Zelnik-Manor · SSeg, VLM, CLIP · 22 Apr 2021
- The Power of Scale for Parameter-Efficient Prompt Tuning
  Brian Lester, Rami Al-Rfou, Noah Constant · VPVLM · 18 Apr 2021
- Adversarial Continual Learning
  Sayna Ebrahimi, Franziska Meier, Roberto Calandra, Trevor Darrell, Marcus Rohrbach · CLL, VLM · 21 Mar 2020