Relational Representation Distillation

16 July 2024
Nikolaos Giakoumoglou, Tania Stathaki

Papers citing "Relational Representation Distillation"

3 papers shown
$V_kD$: Improving Knowledge Distillation using Orthogonal Projections
Roy Miles, Ismail Elezi, Jiankang Deng
10 Mar 2024
Improved Baselines with Momentum Contrastive Learning
Xinlei Chen, Haoqi Fan, Ross B. Girshick, Kaiming He
SSL · 09 Mar 2020
Boosting Self-Supervised Learning via Knowledge Transfer
M. Noroozi, Ananth Vinjimoor, Paolo Favaro, Hamed Pirsiavash
SSL · 01 May 2018