Dynamic Sub-graph Distillation for Robust Semi-supervised Continual Learning

27 December 2023
Yan Fan, Yu Wang, Pengfei Zhu, Qinghua Hu
CLL
Papers citing "Dynamic Sub-graph Distillation for Robust Semi-supervised Continual Learning"

3 / 3 papers shown
Reducing Class-wise Confusion for Incremental Learning with Disentangled Manifolds
Huitong Chen, Yu Wang, Yan Fan, Guosong Jiang, Q. Hu
CLL · 22 Mar 2025

Reshaping the Online Data Buffering and Organizing Mechanism for Continual Test-Time Adaptation
Zhilin Zhu, Xiaopeng Hong, Zhiheng Ma, Weijun Zhuang, Yaohui Ma, Yong Dai, Yaowei Wang
CLL, TTA · 12 Jul 2024

Memory-Efficient Semi-Supervised Continual Learning: The World is its Own Replay Buffer
James Smith, Jonathan C. Balloch, Yen-Chang Hsu, Z. Kira
CLL · 23 Jan 2021