Cited By

Prime-Aware Adaptive Distillation (arXiv:2008.01458)
4 August 2020
Youcai Zhang, Zhonghao Lan, Yuchen Dai, Fangao Zeng, Yan Bai, Jie Chang, Yichen Wei

Papers citing "Prime-Aware Adaptive Distillation" (26 papers)

Feature Alignment and Representation Transfer in Knowledge Distillation for Large Language Models
Junjie Yang, Junhao Song, Xudong Han, Ziqian Bi, Pohsun Feng, ..., Yujiao Shi, Qian Niu, Cheng Fei, Keyu Chen, Ming Liu
18 Apr 2025

Sparse Logit Sampling: Accelerating Knowledge Distillation in LLMs
Annual Meeting of the Association for Computational Linguistics (ACL), 2025
Anshumann, Mohd Abbas Zaidi, Akhil Kedia, Jinwoo Ahn, Taehwak Kwon, Kangwook Lee, Haejun Lee, Joohyung Lee
21 Mar 2025

CPFD: Confidence-aware Privileged Feature Distillation for Short Video Classification
International Conference on Information and Knowledge Management (CIKM), 2024
Jinghao Shi, Xiang Shen, Kaili Zhao, Xuedong Wang, Vera Wen, Zixuan Wang, Yifan Wu, Zhixin Zhang
03 Oct 2024

Online Policy Distillation with Decision-Attention
IEEE International Joint Conference on Neural Networks (IJCNN), 2024
Xinqiang Yu, Chuanguang Yang, Chengqing Yu, Libo Huang, Zhulin An, Yongjun Xu
08 Jun 2024

LIX: Implicitly Infusing Spatial Geometric Prior Knowledge into Visual Semantic Segmentation for Autonomous Driving
Sicen Guo, Zhiyuan Wu, Qijun Chen, Ioannis Pitas, Rui Fan
13 Mar 2024

torchdistill Meets Hugging Face Libraries for Reproducible, Coding-Free Deep Learning Studies: A Case Study on NLP
Yoshitomo Matsubara
26 Oct 2023

Understanding the Effects of Projectors in Knowledge Distillation
Yudong Chen, Sen Wang, Jiajun Liu, Xuwei Xu, Frank de Hoog, Brano Kusy, Zi Huang
26 Oct 2023

Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation
Chuanguang Yang, Xinqiang Yu, Zhulin An, Yongjun Xu
19 Jun 2023

Avatar Knowledge Distillation: Self-ensemble Teacher Paradigm with Uncertainty
ACM Multimedia (ACM MM), 2023
Yuan Zhang, Weihua Chen, Yichen Lu, Tao Huang, Xiuyu Sun, Jian Cao
04 May 2023

Head3D: Complete 3D Head Generation via Tri-plane Feature Distillation
Y. Cheng, Manwen Liao, Wenhan Zhu, Ye Pan, Bowen Pan, Yunbo Wang
28 Mar 2023

Understanding the Role of the Projector in Knowledge Distillation
AAAI Conference on Artificial Intelligence (AAAI), 2023
Roy Miles, K. Mikolajczyk
20 Mar 2023

Knowledge Distillation from Single to Multi Labels: an Empirical Study
Youcai Zhang, Yuzhuo Qin, Heng-Ye Liu, Yanhao Zhang, Yaqian Li, X. Gu
15 Mar 2023

Improved Feature Distillation via Projector Ensemble
Neural Information Processing Systems (NeurIPS), 2022
Yudong Chen, Sen Wang, Jiajun Liu, Xuwei Xu, Frank de Hoog, Zi Huang
27 Oct 2022

Distilling Object Detectors With Global Knowledge
European Conference on Computer Vision (ECCV), 2022
Sanli Tang, Zhongyu Zhang, Zhanzhan Cheng, Jing Lu, Yunlu Xu, Yi Niu, Fan He
17 Oct 2022

Dynamic Contrastive Distillation for Image-Text Retrieval
IEEE Transactions on Multimedia (IEEE TMM), 2022
Jun Rao, Liang Ding, Shuhan Qi, Meng Fang, Yang Liu, Liqiong Shen, Dacheng Tao
04 Jul 2022

Generalized Knowledge Distillation via Relationship Matching
IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2022
Han-Jia Ye, Su Lu, De-Chuan Zhan
04 May 2022

Adaptive Instance Distillation for Object Detection in Autonomous Driving
International Conference on Pattern Recognition (ICPR), 2022
Qizhen Lan, Qing Tian
26 Jan 2022

Data-Free Knowledge Transfer: A Survey
Yuang Liu, Wei Zhang, Jun Wang, Jianyong Wang
31 Dec 2021

Information Theoretic Representation Distillation
Roy Miles, Adrian Lopez-Rodriguez, K. Mikolajczyk
01 Dec 2021

LGD: Label-guided Self-distillation for Object Detection
AAAI Conference on Artificial Intelligence (AAAI), 2022
Peizhen Zhang, Zijian Kang, Tong Yang, Xinming Zhang, N. Zheng, Jian Sun
23 Sep 2021

Teacher's pet: understanding and mitigating biases in distillation
Michal Lukasik, Srinadh Bhojanapalli, A. Menon, Sanjiv Kumar
19 Jun 2021

Distilling Object Detectors via Decoupled Features
Computer Vision and Pattern Recognition (CVPR), 2021
Jianyuan Guo, Kai Han, Yunhe Wang, Han Wu, Xinghao Chen, Chunjing Xu, Chang Xu
26 Mar 2021

torchdistill: A Modular, Configuration-Driven Framework for Knowledge Distillation
International Workshop on Reproducible Research in Pattern Recognition (RRPR), 2020
Yoshitomo Matsubara
25 Nov 2020

Distilling Knowledge by Mimicking Features
G. Wang, Yifan Ge, Jianxin Wu
03 Nov 2020

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
09 Jun 2020

Contrastive Representation Distillation
International Conference on Learning Representations (ICLR), 2020
Yonglong Tian, Dilip Krishnan, Phillip Isola
23 Oct 2019