Why does Knowledge Distillation Work? Rethink its Attention and Fidelity Mechanism
arXiv: 2405.00739
30 April 2024
Chenqi Guo
Shiwei Zhong
Xiaofeng Liu
Qianli Feng
Yinglong Ma
Papers citing "Why does Knowledge Distillation Work? Rethink its Attention and Fidelity Mechanism"
Crossmodal Knowledge Distillation with WordNet-Relaxed Text Embeddings for Robust Image Classification
Chenqi Guo
Mengshuo Rong
Qianli Feng
Rongfan Feng
Yinglong Ma
VLM
31 Mar 2025