Why does Knowledge Distillation Work? Rethink its Attention and Fidelity Mechanism

30 April 2024 · arXiv:2405.00739
Chenqi Guo, Shiwei Zhong, Xiaofeng Liu, Qianli Feng, Yinglong Ma

Papers citing "Why does Knowledge Distillation Work? Rethink its Attention and Fidelity Mechanism"

1 paper shown:

Crossmodal Knowledge Distillation with WordNet-Relaxed Text Embeddings for Robust Image Classification
Chenqi Guo, Mengshuo Rong, Qianli Feng, Rongfan Feng, Yinglong Ma
VLM · 31 Mar 2025