arXiv:2402.16092
StochCA: A Novel Approach for Exploiting Pretrained Models with Cross-Attention
SeungWon Seo, Suho Lee, Sangheum Hwang
25 February 2024
Papers citing "StochCA: A Novel Approach for Exploiting Pretrained Models with Cross-Attention" (3 of 3 papers shown)

| Title | Authors | Topics | Counts | Date |
|---|---|---|---|---|
| The effectiveness of MAE pre-pretraining for billion-scale pretraining | Mannat Singh, Quentin Duval, Kalyan Vasudev Alwala, Haoqi Fan, Vaibhav Aggarwal, ..., Piotr Dollár, Christoph Feichtenhofer, Ross B. Girshick, Rohit Girdhar, Ishan Misra | LRM | 102 / 62 / 0 | 23 Mar 2023 |
| Masked Autoencoders Are Scalable Vision Learners | Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross B. Girshick | ViT, TPM | 258 / 7,337 / 0 | 11 Nov 2021 |
| Domain-Adversarial Training of Neural Networks | Yaroslav Ganin, E. Ustinova, Hana Ajakan, Pascal Germain, Hugo Larochelle, François Laviolette, M. Marchand, Victor Lempitsky | GAN, OOD | 149 / 9,300 / 0 | 28 May 2015 |