Prime-Aware Adaptive Distillation

4 August 2020
Youcai Zhang, Zhonghao Lan, Yuchen Dai, Fangao Zeng, Yan Bai, Jie Chang, Yichen Wei
ArXiv (abs) · PDF · HTML

Papers citing "Prime-Aware Adaptive Distillation"

26 papers shown

Feature Alignment and Representation Transfer in Knowledge Distillation for Large Language Models
Junjie Yang, Junhao Song, Xudong Han, Ziqian Bi, Pohsun Feng, ..., Yujiao Shi, Qian Niu, Cheng Fei, Keyu Chen, Ming Liu
VLM · 335 / 4 / 0 · 18 Apr 2025

Sparse Logit Sampling: Accelerating Knowledge Distillation in LLMs. Annual Meeting of the Association for Computational Linguistics (ACL), 2025.
Anshumann, Mohd Abbas Zaidi, Akhil Kedia, Jinwoo Ahn, Taehwak Kwon, Kangwook Lee, Haejun Lee, Joohyung Lee
FedML · 820 / 1 / 0 · 21 Mar 2025

CPFD: Confidence-aware Privileged Feature Distillation for Short Video Classification. International Conference on Information and Knowledge Management (CIKM), 2024.
Jinghao Shi, Xiang Shen, Kaili Zhao, Xuedong Wang, Vera Wen, Zixuan Wang, Yifan Wu, Zhixin Zhang
286 / 2 / 0 · 03 Oct 2024

Online Policy Distillation with Decision-Attention. IEEE International Joint Conference on Neural Networks (IJCNN), 2024.
Xinqiang Yu, Chuanguang Yang, Chengqing Yu, Libo Huang, Zhulin An, Yongjun Xu
OffRL · 292 / 1 / 0 · 08 Jun 2024

LIX: Implicitly Infusing Spatial Geometric Prior Knowledge into Visual Semantic Segmentation for Autonomous Driving
Sicen Guo, Zhiyuan Wu, Qijun Chen, Ioannis Pitas, Rui Fan
341 / 7 / 0 · 13 Mar 2024

torchdistill Meets Hugging Face Libraries for Reproducible, Coding-Free Deep Learning Studies: A Case Study on NLP
Yoshitomo Matsubara
VLM · 248 / 1 / 0 · 26 Oct 2023

Understanding the Effects of Projectors in Knowledge Distillation
Yudong Chen, Sen Wang, Jiajun Liu, Xuwei Xu, Frank de Hoog, Brano Kusy, Zi Huang
290 / 4 / 0 · 26 Oct 2023

Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation
Chuanguang Yang, Xinqiang Yu, Zhulin An, Yongjun Xu
VLM, OffRL · 557 / 35 / 0 · 19 Jun 2023

Avatar Knowledge Distillation: Self-ensemble Teacher Paradigm with Uncertainty. ACM Multimedia (ACM MM), 2023.
Yuan Zhang, Weihua Chen, Yichen Lu, Tao Huang, Xiuyu Sun, Jian Cao
407 / 11 / 0 · 04 May 2023

Head3D: Complete 3D Head Generation via Tri-plane Feature Distillation
Y. Cheng, Manwen Liao, Wenhan Zhu, Ye Pan, Bowen Pan, Yunbo Wang
3DH · 182 / 10 / 0 · 28 Mar 2023

Understanding the Role of the Projector in Knowledge Distillation. AAAI Conference on Artificial Intelligence (AAAI), 2023.
Roy Miles, K. Mikolajczyk
299 / 47 / 0 · 20 Mar 2023

Knowledge Distillation from Single to Multi Labels: an Empirical Study
Youcai Zhang, Yuzhuo Qin, Heng-Ye Liu, Yanhao Zhang, Yaqian Li, X. Gu
VLM · 198 / 2 / 0 · 15 Mar 2023

Improved Feature Distillation via Projector Ensemble. Neural Information Processing Systems (NeurIPS), 2022.
Yudong Chen, Sen Wang, Jiajun Liu, Xuwei Xu, Frank de Hoog, Zi Huang
242 / 55 / 0 · 27 Oct 2022

Distilling Object Detectors With Global Knowledge. European Conference on Computer Vision (ECCV), 2022.
Sanli Tang, Zhongyu Zhang, Zhanzhan Cheng, Jing Lu, Yunlu Xu, Yi Niu, Fan He
252 / 11 / 0 · 17 Oct 2022

Dynamic Contrastive Distillation for Image-Text Retrieval. IEEE Transactions on Multimedia (IEEE TMM), 2022.
Jun Rao, Liang Ding, Shuhan Qi, Meng Fang, Yang Liu, Liqiong Shen, Dacheng Tao
VLM · 176 / 39 / 0 · 04 Jul 2022

Generalized Knowledge Distillation via Relationship Matching. IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2022.
Han-Jia Ye, Su Lu, De-Chuan Zhan
FedML · 179 / 25 / 0 · 04 May 2022

Adaptive Instance Distillation for Object Detection in Autonomous Driving. International Conference on Pattern Recognition (ICPR), 2022.
Qizhen Lan, Qing Tian
252 / 8 / 0 · 26 Jan 2022

Data-Free Knowledge Transfer: A Survey
Yuang Liu, Wei Zhang, Jun Wang, Jianyong Wang
287 / 55 / 0 · 31 Dec 2021

Information Theoretic Representation Distillation
Roy Miles, Adrian Lopez-Rodriguez, K. Mikolajczyk
MQ · 354 / 26 / 0 · 01 Dec 2021

LGD: Label-guided Self-distillation for Object Detection. AAAI Conference on Artificial Intelligence (AAAI), 2021.
Peizhen Zhang, Zijian Kang, Tong Yang, Xinming Zhang, N. Zheng, Jian Sun
ObjD · 412 / 37 / 0 · 23 Sep 2021

Teacher's pet: understanding and mitigating biases in distillation
Michal Lukasik, Srinadh Bhojanapalli, A. Menon, Sanjiv Kumar
215 / 29 / 0 · 19 Jun 2021

Distilling Object Detectors via Decoupled Features. Computer Vision and Pattern Recognition (CVPR), 2021.
Jianyuan Guo, Kai Han, Yunhe Wang, Han Wu, Xinghao Chen, Chunjing Xu, Chang Xu
255 / 242 / 0 · 26 Mar 2021

torchdistill: A Modular, Configuration-Driven Framework for Knowledge Distillation. International Workshop on Reproducible Research in Pattern Recognition (RRPR), 2020.
Yoshitomo Matsubara
258 / 25 / 0 · 25 Nov 2020

Distilling Knowledge by Mimicking Features
G. Wang, Yifan Ge, Jianxin Wu
372 / 55 / 0 · 03 Nov 2020

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
VLM · 1.9K / 3,737 / 0 · 09 Jun 2020

Contrastive Representation Distillation. International Conference on Learning Representations (ICLR), 2019.
Yonglong Tian, Dilip Krishnan, Phillip Isola
1.4K / 1,209 / 0 · 23 Oct 2019