Improving Self-Supervised Learning by Characterizing Idealized Representations
Yann Dubois, Tatsunori Hashimoto, Stefano Ermon, Percy Liang
arXiv 2209.06235 · 13 September 2022 · SSL

Papers citing "Improving Self-Supervised Learning by Characterizing Idealized Representations" (19 papers shown)
1. Locality Alignment Improves Vision-Language Models
   Ian Covert, Tony Sun, James Y. Zou, Tatsunori Hashimoto · VLM · 14 Oct 2024

2. InfoNCE: Identifying the Gap Between Theory and Practice
   E. Rusak, Patrik Reizinger, Attila Juhos, Oliver Bringmann, Roland S. Zimmermann, Wieland Brendel · 28 Jun 2024

3. LDReg: Local Dimensionality Regularized Self-Supervised Learning
   Hanxun Huang, R. Campello, S. Erfani, Xingjun Ma, Michael E. Houle, James Bailey · 19 Jan 2024

4. No Representation Rules Them All in Category Discovery
   S. Vaze, Andrea Vedaldi, Andrew Zisserman · OOD · 28 Nov 2023

5. Quantifying the Variability Collapse of Neural Networks
   Jing-Xue Xu, Haoxiong Liu · 06 Jun 2023

6. A surprisingly simple technique to control the pretraining bias for better transfer: Expand or Narrow your representation
   Florian Bordes, Samuel Lavoie, Randall Balestriero, Nicolas Ballas, Pascal Vincent · SSL · 11 Apr 2023

7. Unsupervised Learning on a DIET: Datum IndEx as Target Free of Self-Supervision, Reconstruction, Projector Head
   Randall Balestriero · 20 Feb 2023

8. GEDI: GEnerative and DIscriminative Training for Self-Supervised Learning
   Emanuele Sansone, Robin Manhaeve · SSL · 27 Dec 2022

9. SynBench: Task-Agnostic Benchmarking of Pretrained Representations using Synthetic Data
   Ching-Yun Ko, Pin-Yu Chen, Jeet Mohapatra, Payel Das, Luca Daniel · 06 Oct 2022

10. An Unconstrained Layer-Peeled Perspective on Neural Collapse
    Wenlong Ji, Yiping Lu, Yiliang Zhang, Zhun Deng, Weijie J. Su · 06 Oct 2021

11. On Feature Decorrelation in Self-Supervised Learning
    Tianyu Hua, Wenxiao Wang, Zihui Xue, Sucheng Ren, Yue Wang, Hang Zhao · SSL, OOD · 02 May 2021

12. Emerging Properties in Self-Supervised Vision Transformers
    Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin · 29 Apr 2021

13. Dissecting Supervised Contrastive Learning
    Florian Graf, Christoph Hofer, Marc Niethammer, Roland Kwitt · SSL · 17 Feb 2021

14. Understanding self-supervised Learning Dynamics without Contrastive Pairs
    Yuandong Tian, Xinlei Chen, Surya Ganguli · SSL · 12 Feb 2021

15. Exploring Deep Neural Networks via Layer-Peeled Model: Minority Collapse in Imbalanced Training
    Cong Fang, Hangfeng He, Qi Long, Weijie J. Su · FAtt · 29 Jan 2021

16. SEED: Self-supervised Distillation For Visual Representation
    Zhiyuan Fang, Jianfeng Wang, Lijuan Wang, Lei Zhang, Yezhou Yang, Zicheng Liu · SSL · 12 Jan 2021

17. BYOL works even without batch statistics
    Pierre Harvey Richemond, Jean-Bastien Grill, Florent Altché, Corentin Tallec, Florian Strub, ..., Samuel L. Smith, Soham De, Razvan Pascanu, Bilal Piot, Michal Valko · SSL · 20 Oct 2020

18. Improving Transformation Invariance in Contrastive Representation Learning
    Adam Foster, Rattana Pukdee, Tom Rainforth · 19 Oct 2020

19. For self-supervised learning, Rationality implies generalization, provably
    Yamini Bansal, Gal Kaplun, Boaz Barak · OOD, SSL · 16 Oct 2020