A Generalization Theory of Cross-Modality Distillation with Contrastive Learning
Hangyu Lin, Chen Liu, Chengming Xu, Zhengqi Gao, Yanwei Fu, Yuan Yao
arXiv:2405.03355, 6 May 2024 (VLM)
Papers citing "A Generalization Theory of Cross-Modality Distillation with Contrastive Learning" (3 papers shown):
On the Provable Advantage of Unsupervised Pretraining
Jiawei Ge, Shange Tang, Jianqing Fan, Chi Jin
2 Mar 2023 (SSL)
Cross-Modal Knowledge Transfer Without Task-Relevant Source Data
Sk. Miraj Ahmed, Suhas Lohit, Kuan-Chuan Peng, Michael J. Jones, A. Roy-Chowdhury
8 Sep 2022
SEED: Self-supervised Distillation For Visual Representation
Zhiyuan Fang, Jianfeng Wang, Lijuan Wang, Lei Zhang, Yezhou Yang, Zicheng Liu
12 Jan 2021 (SSL)