ResearchTrend.AI
Diverse Feature Learning by Self-distillation and Reset
Sejik Park
29 March 2024
Communities: CLL
arXiv:2403.19941 (PDF / HTML available)

Papers citing "Diverse Feature Learning by Self-distillation and Reset" (4 papers)
Learning More Generalized Experts by Merging Experts in Mixture-of-Experts
Sejik Park
Communities: FedML, CLL, MoMe
19 May 2024
PLASTIC: Improving Input and Label Plasticity for Sample Efficient Reinforcement Learning
Hojoon Lee, Hanseul Cho, Hyunseung Kim, Daehoon Gwak, Joonkee Kim, Jaegul Choo, Se-Young Yun, Chulhee Yun
Communities: OffRL
19 Jun 2023
Training Debiased Subnetworks with Contrastive Weight Pruning
Geon Yeong Park, Sangmin Lee, Sang Wan Lee, Jong Chul Ye
Communities: CML
11 Oct 2022
Learning Fast, Learning Slow: A General Continual Learning Method based on Complementary Learning System
Elahe Arani, F. Sarfraz, Bahram Zonooz
Communities: CLL
29 Jan 2022