Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution
IEEE Transactions on Neural Networks and Learning Systems (TNNLS), 2021
Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu
arXiv:2109.03075, 7 September 2021
Links: arXiv (abs) · PDF · HTML · GitHub (75★)

Papers citing "Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution"

13 citing papers shown.

Make me an Expert: Distilling from Generalist Black-Box Models into Specialized Models for Semantic Segmentation
Yasser Benigmim, Subhankar Roy, Khalid Oublal, Imad Eddine Marouf, S. Essid, Vicky Kalogeiton, Stéphane Lathuilière
30 Aug 2025

A Layered Self-Supervised Knowledge Distillation Framework for Efficient Multimodal Learning on the Edge
Tarique Dahri, Zulfiqar Ali Memon, Zhenyu Yu, Mohd Yamani Idna Idris, Sheheryar Khan, Sadiq Ahmad, Maged Shoman, Saddam Aziz, Rizwan Qureshi
08 Jun 2025

Multi-Teacher Knowledge Distillation with Reinforcement Learning for Visual Recognition
AAAI Conference on Artificial Intelligence (AAAI), 2025
Chuanguang Yang, Xinqiang Yu, Han Yang, Zhulin An, Chengqing Yu, Libo Huang, Yongjun Xu
22 Feb 2025

Online Policy Distillation with Decision-Attention
IEEE International Joint Conference on Neural Networks (IJCNN), 2024
Xinqiang Yu, Chuanguang Yang, Chengqing Yu, Libo Huang, Zhulin An, Yongjun Xu
Communities: OffRL
08 Jun 2024

Sinkhorn Distance Minimization for Knowledge Distillation
Xiao Cui, Yulei Qin, Yuting Gao, Enwei Zhang, Zihan Xu, Tong Wu, Ke Li, Xing Sun, Wen-gang Zhou, Houqiang Li
27 Feb 2024

Decoupled Knowledge with Ensemble Learning for Online Distillation
Baitan Shao, Ying Chen
18 Dec 2023

Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation
Chuanguang Yang, Xinqiang Yu, Zhulin An, Yongjun Xu
Communities: VLM, OffRL
19 Jun 2023

Prototype-guided Cross-task Knowledge Distillation for Large-scale Models
Deng Li, Aming Wu, Yahong Han, Qingwen Tian
Communities: VLM
26 Dec 2022

Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition
IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2022
Chuanguang Yang, Zhulin An, Helong Zhou, Fuzhen Zhuang, Yongjun Xu, Qian Zhang
23 Jul 2022

CoupleFace: Relation Matters for Face Recognition Distillation
European Conference on Computer Vision (ECCV), 2022
Jiaheng Liu, Haoyu Qin, Yichao Wu, Jinyang Guo, Ding Liang, Ke Xu
Communities: CVBM
12 Apr 2022

Teacher's pet: understanding and mitigating biases in distillation
Michal Lukasik, Srinadh Bhojanapalli, A. Menon, Sanjiv Kumar
19 Jun 2021

Mutual Contrastive Learning for Visual Representation Learning
AAAI Conference on Artificial Intelligence (AAAI), 2021
Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu
Communities: VLM, SSL
26 Apr 2021

Contrastive Representation Distillation
International Conference on Learning Representations (ICLR), 2019
Yonglong Tian, Dilip Krishnan, Phillip Isola
23 Oct 2019