ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

arXiv:2107.13715
Hierarchical Self-supervised Augmented Knowledge Distillation
29 July 2021
Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu
SSL

Papers citing "Hierarchical Self-supervised Augmented Knowledge Distillation"

36 / 36 papers shown
MoKD: Multi-Task Optimization for Knowledge Distillation
Zeeshan Hayder, A. Cheraghian, Lars Petersson, Mehrtash Harandi
VLM
13 May 2025

HierSum: A Global and Local Attention Mechanism for Video Summarization
Apoorva Beedu, Irfan Essa
25 Apr 2025

Multi-Teacher Knowledge Distillation with Reinforcement Learning for Visual Recognition
Chuanguang Yang, Xinqiang Yu, Han Yang, Zhulin An, Chengqing Yu, Libo Huang, Y. Xu
22 Feb 2025

Cross-View Consistency Regularisation for Knowledge Distillation
W. Zhang, Dongnan Liu, Weidong Cai, Chao Ma
21 Dec 2024

Preview-based Category Contrastive Learning for Knowledge Distillation
Muhe Ding, Jianlong Wu, Xue Dong, Xiaojie Li, Pengda Qin, Tian Gan, Liqiang Nie
VLM
18 Oct 2024

Prototype-Driven Multi-Feature Generation for Visible-Infrared Person Re-identification
Jiarui Li, Zhen Qiu, Yilin Yang, Yuqi Li, Zeyu Dong, Chuanguang Yang
09 Sep 2024

How to Train the Teacher Model for Effective Knowledge Distillation
Shayan Mohajer Hamidi, Xizhen Deng, Renhao Tan, Linfeng Ye, Ahmed H. Salamah
25 Jul 2024
Online Policy Distillation with Decision-Attention
Xinqiang Yu, Chuanguang Yang, Chengqing Yu, Libo Huang, Zhulin An, Yongjun Xu
OffRL
08 Jun 2024

A Comprehensive Review of Knowledge Distillation in Computer Vision
Sheikh Musa Kaleem, Tufail Rouf, Gousia Habib, Tausifa Jan Saleem, Brejesh Lall
VLM
01 Apr 2024

Precise Knowledge Transfer via Flow Matching
Shitong Shao, Zhiqiang Shen, Linrui Gong, Huanran Chen, Xu Dai
03 Feb 2024

Bayes Conditional Distribution Estimation for Knowledge Distillation Based on Conditional Mutual Information
Linfeng Ye, Shayan Mohajer Hamidi, Renhao Tan, En-Hui Yang
VLM
16 Jan 2024

Comparative Knowledge Distillation
Alex Wilf, Alex Tianyi Xu, Paul Pu Liang, A. Obolenskiy, Daniel Fried, Louis-Philippe Morency
VLM
03 Nov 2023

Heterogeneous Generative Knowledge Distillation with Masked Image Modeling
Ziming Wang, Shumin Han, Xiaodi Wang, Jing Hao, Xianbin Cao, Baochang Zhang
VLM
18 Sep 2023

CLIP-KD: An Empirical Study of CLIP Model Distillation
Chuanguang Yang, Zhulin An, Libo Huang, Junyu Bi, Xinqiang Yu, Hansheng Yang, Boyu Diao, Yongjun Xu
VLM
24 Jul 2023
Customizing Synthetic Data for Data-Free Student Learning
Shiya Luo, Defang Chen, Can Wang
10 Jul 2023

Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation
Chuanguang Yang, Xinqiang Yu, Zhulin An, Yongjun Xu
VLM, OffRL
19 Jun 2023

Team AcieLee: Technical Report for EPIC-SOUNDS Audio-Based Interaction Recognition Challenge 2023
Yuqi Li, Yi-Jhen Luo, Xiaoshuai Hao, Chuanguang Yang, Zhulin An, Dantong Song, Wei Yi
15 Jun 2023

Catch-Up Distillation: You Only Need to Train Once for Accelerating Sampling
Shitong Shao, Xu Dai, Shouyi Yin, Lujun Li, Huanran Chen, Yang Hu
18 May 2023

eTag: Class-Incremental Learning with Embedding Distillation and Task-Oriented Generation
Libo Huang, Yan Zeng, Chuanguang Yang, Zhulin An, Boyu Diao, Yongjun Xu
CLL
20 Apr 2023

Towards Understanding the Effect of Pretraining Label Granularity
Guanzhe Hong, Yin Cui, Ariel Fuxman, Stanley H. Chan, Enming Luo
29 Mar 2023

Understanding the Role of the Projector in Knowledge Distillation
Roy Miles, K. Mikolajczyk
20 Mar 2023

Ladder Siamese Network: a Method and Insights for Multi-level Self-Supervised Learning
Ryota Yoshihashi, Shuhei Nishimura, Dai Yonebayashi, Yuya Otsuka, Tomohiro Tanaka, Takashi Miyazaki
SSL
25 Nov 2022
Online Cross-Layer Knowledge Distillation on Graph Neural Networks with Deep Supervision
Jiongyu Guo, Defang Chen, Can Wang
25 Oct 2022

Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition
Chuanguang Yang, Zhulin An, Helong Zhou, Fuzhen Zhuang, Yongjun Xu, Qian Zhang
23 Jul 2022

HEAD: HEtero-Assists Distillation for Heterogeneous Object Detectors
Luting Wang, Xiaojie Li, Yue Liao, Jiang, Jianlong Wu, Fei-Yue Wang, Chao Qian, Si Liu
12 Jul 2022

Localizing Semantic Patches for Accelerating Image Classification
Chuanguang Yang, Zhulin An, Yongjun Xu
SSeg
07 Jun 2022

Cross-Domain Correlation Distillation for Unsupervised Domain Adaptation in Nighttime Semantic Segmentation
Huan-ang Gao, Jichang Guo, Guoli Wang, Qian Zhang
02 May 2022

Proto2Proto: Can you recognize the car, the way I do?
Monish Keswani, Sriranjani Ramakrishnan, Nishant Reddy, V. Balasubramanian
25 Apr 2022

Cross-Image Relational Knowledge Distillation for Semantic Segmentation
Chuanguang Yang, Helong Zhou, Zhulin An, Xue Jiang, Yong Xu, Qian Zhang
14 Apr 2022

CoupleFace: Relation Matters for Face Recognition Distillation
Jiaheng Liu, Haoyu Qin, Yichao Wu, Jinyang Guo, Ding Liang, Ke Xu
CVBM
12 Apr 2022

Information Theoretic Representation Distillation
Roy Miles, Adrian Lopez-Rodriguez, K. Mikolajczyk
MQ
01 Dec 2021
Optimizing for In-memory Deep Learning with Emerging Memory Technology
Zhehui Wang, Tao Luo, Rick Siow Mong Goh, Wei Zhang, Weng-Fai Wong
01 Dec 2021

Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution
Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu
07 Sep 2021

Multi-View Correlation Distillation for Incremental Object Detection
Dongbao Yang, Yu Zhou, Weiping Wang
05 Jul 2021

Teacher's pet: understanding and mitigating biases in distillation
Michal Lukasik, Srinadh Bhojanapalli, A. Menon, Sanjiv Kumar
19 Jun 2021

Mutual Contrastive Learning for Visual Representation Learning
Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu
VLM, SSL
26 Apr 2021