2411.08937
Dual-Head Knowledge Distillation: Enhancing Logits Utilization with an Auxiliary Head
13 November 2024
Penghui Yang
Chen-Chen Zong
Sheng-Jun Huang
Lei Feng
Bo An
Papers citing
"Dual-Head Knowledge Distillation: Enhancing Logits Utilization with an Auxiliary Head"
Simple Semi-supervised Knowledge Distillation from Vision-Language Models via Dual-Head Optimization
Seongjae Kang
Dong Bok Lee
Hyungjoon Jang
Sung Ju Hwang
VLM
12 May 2025