Dual-Head Knowledge Distillation: Enhancing Logits Utilization with an Auxiliary Head

13 November 2024
Penghui Yang
Chen-Chen Zong
Sheng-Jun Huang
Lei Feng
Bo An

Papers citing "Dual-Head Knowledge Distillation: Enhancing Logits Utilization with an Auxiliary Head"

1 / 1 papers shown
Simple Semi-supervised Knowledge Distillation from Vision-Language Models via Dual-Head Optimization
Seongjae Kang
Dong Bok Lee
Hyungjoon Jang
Sung Ju Hwang
VLM
12 May 2025