Distilling Knowledge from Self-Supervised Teacher by Embedding Graph Alignment

23 November 2022
Yuchen Ma, Yanbei Chen, Zeynep Akata

Papers citing "Distilling Knowledge from Self-Supervised Teacher by Embedding Graph Alignment"

4 / 4 papers shown
Relational Representation Distillation
Nikolaos Giakoumoglou, Tania Stathaki
16 Jul 2024
Relation Modeling and Distillation for Learning with Noisy Labels
Xiaming Chen, Junlin Zhang, Zhuang Qi, Xin Qi
Communities: NoLa
30 May 2024
Distilling Audio-Visual Knowledge by Compositional Contrastive Learning
Yanbei Chen, Yongqin Xian, A. Sophia Koepke, Ying Shan, Zeynep Akata
22 Apr 2021
Scaling Up Visual and Vision-Language Representation Learning With Noisy Text Supervision
Chao Jia, Yinfei Yang, Ye Xia, Yi-Ting Chen, Zarana Parekh, Hieu H. Pham, Quoc V. Le, Yun-hsuan Sung, Zhen Li, Tom Duerig
Communities: VLM, CLIP
11 Feb 2021