What Do We Maximize in Self-Supervised Learning?

20 July 2022 · arXiv:2207.10081
Ravid Shwartz-Ziv, Randall Balestriero, Yann LeCun
SSL

Papers citing "What Do We Maximize in Self-Supervised Learning?"

4 papers shown

Reverse Engineering Self-Supervised Learning
Ido Ben-Shaul, Ravid Shwartz-Ziv, Tomer Galanti, S. Dekel, Yann LeCun
SSL · 24 May 2023

Minimalistic Unsupervised Learning with the Sparse Manifold Transform
Yubei Chen, Zeyu Yun, Y. Ma, Bruno A. Olshausen, Yann LeCun
30 Sep 2022

Compressive Visual Representations
Kuang-Huei Lee, Anurag Arnab, S. Guadarrama, John F. Canny, Ian S. Fischer
SSL · 27 Sep 2021

Emerging Properties in Self-Supervised Vision Transformers
Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin
29 Apr 2021