Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution
arXiv: 2109.03075, 7 September 2021
Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu
Papers citing "Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution" (10 of 10 shown):
1. Multi-Teacher Knowledge Distillation with Reinforcement Learning for Visual Recognition
   Chuanguang Yang, Xinqiang Yu, Han Yang, Zhulin An, Chengqing Yu, Libo Huang, Y. Xu
   22 Feb 2025

2. Online Policy Distillation with Decision-Attention
   Xinqiang Yu, Chuanguang Yang, Chengqing Yu, Libo Huang, Zhulin An, Yongjun Xu
   08 Jun 2024 [OffRL]

3. Decoupled Knowledge with Ensemble Learning for Online Distillation
   Baitan Shao, Ying Chen
   18 Dec 2023

4. Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation
   Chuanguang Yang, Xinqiang Yu, Zhulin An, Yongjun Xu
   19 Jun 2023 [VLM, OffRL]

5. Prototype-guided Cross-task Knowledge Distillation for Large-scale Models
   Deng Li, Aming Wu, Yahong Han, Qingwen Tian
   26 Dec 2022 [VLM]

6. Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition
   Chuanguang Yang, Zhulin An, Helong Zhou, Fuzhen Zhuang, Yongjun Xu, Qian Zhang
   23 Jul 2022

7. CoupleFace: Relation Matters for Face Recognition Distillation
   Jiaheng Liu, Haoyu Qin, Yichao Wu, Jinyang Guo, Ding Liang, Ke Xu
   12 Apr 2022 [CVBM]

8. Teacher's pet: understanding and mitigating biases in distillation
   Michal Lukasik, Srinadh Bhojanapalli, A. Menon, Sanjiv Kumar
   19 Jun 2021

9. Mutual Contrastive Learning for Visual Representation Learning
   Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu
   26 Apr 2021 [VLM, SSL]

10. Knowledge Distillation by On-the-Fly Native Ensemble
    Xu Lan, Xiatian Zhu, S. Gong
    12 Jun 2018