A Generalization Theory of Cross-Modality Distillation with Contrastive Learning
arXiv:2405.03355

6 May 2024
Hangyu Lin
Chen Liu
Chengming Xu
Zhengqi Gao
Yanwei Fu
Yuan Yao
Topic: VLM

Papers citing "A Generalization Theory of Cross-Modality Distillation with Contrastive Learning"

3 / 3 papers shown
On the Provable Advantage of Unsupervised Pretraining
Jiawei Ge, Shange Tang, Jianqing Fan, Chi Jin
SSL · 22 / 16 / 0 · 02 Mar 2023

Cross-Modal Knowledge Transfer Without Task-Relevant Source Data
Sk. Miraj Ahmed, Suhas Lohit, Kuan-Chuan Peng, Michael J. Jones, A. Roy-Chowdhury
126 / 13 / 0 · 08 Sep 2022

SEED: Self-supervised Distillation For Visual Representation
Zhiyuan Fang, Jianfeng Wang, Lijuan Wang, Lei Zhang, Yezhou Yang, Zicheng Liu
SSL · 231 / 186 / 0 · 12 Jan 2021