Knowledge Distillation from Single to Multi Labels: an Empirical Study
arXiv: 2303.08360
15 March 2023
Youcai Zhang, Yuzhuo Qin, Heng-Ye Liu, Yanhao Zhang, Yaqian Li, X. Gu

Papers citing "Knowledge Distillation from Single to Multi Labels: an Empirical Study" (3 papers)

Feature Alignment and Representation Transfer in Knowledge Distillation for Large Language Models
Junjie Yang, Junhao Song, Xudong Han, Ziqian Bi, Pohsun Feng, ..., Yujiao Shi, Qian Niu, Cheng Fei, Keyu Chen, Ming Liu
18 Apr 2025

InFiConD: Interactive No-code Fine-tuning with Concept-based Knowledge Distillation
Jinbin Huang, Wenbin He, Liang Gou, Liu Ren, Chris Bryan
25 Jun 2024

Contrastive Representation Distillation
International Conference on Learning Representations (ICLR), 2019
Yonglong Tian, Dilip Krishnan, Phillip Isola
23 Oct 2019