Regularizing with Pseudo-Negatives for Continual Self-Supervised Learning

8 June 2023
Sungmin Cha, Kyunghyun Cho, Taesup Moon
Topics: BDL, CLL

Papers citing "Regularizing with Pseudo-Negatives for Continual Self-Supervised Learning"

4 / 4 papers shown

1. Hyperparameters in Continual Learning: A Reality Check
   Sungmin Cha, Kyunghyun Cho
   Topics: CLL
   14 Mar 2024

2. Representational Continuity for Unsupervised Continual Learning
   Divyam Madaan, Jaehong Yoon, Yuanchun Li, Yunxin Liu, Sung Ju Hwang
   Topics: CLL, SSL
   13 Oct 2021

3. How Well Does Self-Supervised Pre-Training Perform with Streaming Data?
   Dapeng Hu, Shipeng Yan, Qizhengqiu Lu, Lanqing Hong, Hailin Hu, Yifan Zhang, Zhenguo Li, Xinchao Wang, Jiashi Feng
   25 Apr 2021

4. Improved Baselines with Momentum Contrastive Learning
   Xinlei Chen, Haoqi Fan, Ross B. Girshick, Kaiming He
   Topics: SSL
   09 Mar 2020