ResearchTrend.AI
CNN-JEPA: Self-Supervised Pretraining Convolutional Neural Networks Using Joint Embedding Predictive Architecture

András Kalapos, Bálint Gyires-Tóth
arXiv:2408.07514 · 14 August 2024

Papers citing "CNN-JEPA: Self-Supervised Pretraining Convolutional Neural Networks Using Joint Embedding Predictive Architecture" (4 papers shown)

  1. Pretraining Generative Flow Networks with Inexpensive Rewards for Molecular Graph Generation
     Mohit Pandey, G. Subbaraj, Artem Cherkasov, Martin Ester, Emmanuel Bengio
     AI4CE · 08 Mar 2025

  2. GFlowNet Pretraining with Inexpensive Rewards
     Mohit Pandey, G. Subbaraj, Emmanuel Bengio
     AI4CE · 15 Sep 2024

  3. Masked Autoencoders Are Scalable Vision Learners
     Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross B. Girshick
     ViT, TPM · 11 Nov 2021

  4. Emerging Properties in Self-Supervised Vision Transformers
     Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin
     29 Apr 2021