Rethinking Momentum Knowledge Distillation in Online Continual Learning (arXiv:2309.02870)

6 September 2023
Nicolas Michel, Maorong Wang, L. Xiao, T. Yamasaki
CLL

Papers citing "Rethinking Momentum Knowledge Distillation in Online Continual Learning"

3 / 3 papers shown
Hyperparameters in Continual Learning: A Reality Check
Sungmin Cha, Kyunghyun Cho
CLL
71 · 2 · 0
14 Mar 2024

Online Continual Learning via the Meta-learning Update with Multi-scale Knowledge Distillation and Data Augmentation
Ya-nan Han, Jian-wei Liu
KELM, CLL
33 · 10 · 0
12 Sep 2022

Emerging Properties in Self-Supervised Vision Transformers
Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin
303 · 5,773 · 0
29 Apr 2021