ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

WERank: Towards Rank Degradation Prevention for Self-Supervised Learning Using Weight Regularization

14 February 2024
Ali Saheb Pasand, Reza Moravej, Mahdi Biparva, Ali Ghodsi

Papers citing "WERank: Towards Rank Degradation Prevention for Self-Supervised Learning Using Weight Regularization"

4 / 4 papers shown

1. RankMe: Assessing the downstream performance of pretrained self-supervised representations by their rank
   Q. Garrido, Randall Balestriero, Laurent Najman, Yann LeCun — SSL — 05 Oct 2022

2. Augmentation-Free Self-Supervised Learning on Graphs
   Namkyeong Lee, Junseok Lee, Chanyoung Park — 05 Dec 2021

3. On Feature Decorrelation in Self-Supervised Learning
   Tianyu Hua, Wenxiao Wang, Zihui Xue, Sucheng Ren, Yue Wang, Hang Zhao — SSL, OOD — 02 May 2021

4. Efficient Estimation of Word Representations in Vector Space
   Tomáš Mikolov, Kai Chen, G. Corrado, J. Dean — 3DV — 16 Jan 2013