ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Respecting Transfer Gap in Knowledge Distillation
arXiv:2210.12787 · 23 October 2022
Yulei Niu, Long Chen, Chan Zhou, Hanwang Zhang

Papers citing "Respecting Transfer Gap in Knowledge Distillation"

15 / 15 papers shown
Revisiting the Relationship between Adversarial and Clean Training: Why Clean Training Can Make Adversarial Training Better
MingWei Zhou, Xiaobing Pei
30 Mar 2025
Asymmetric Decision-Making in Online Knowledge Distillation: Unifying Consensus and Divergence
Zhaowei Chen, Borui Zhao, Yuchen Ge, Yuhao Chen, Renjie Song, Jiajun Liang
09 Mar 2025
Cross-Domain Knowledge Distillation for Low-Resolution Human Pose Estimation
Zejun Gu, Zhongming Zhao, Henghui Ding, Hao Shen, Zhao Zhang, De-Shuang Huang
19 May 2024
CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective
Wencheng Zhu, Xin Zhou, Pengfei Zhu, Yu Wang, Qinghua Hu
22 Apr 2024
AuG-KD: Anchor-Based Mixup Generation for Out-of-Domain Knowledge Distillation
Zihao Tang, Zheqi Lv, Shengyu Zhang, Yifan Zhou, Xinyu Duan, Fei Wu, Kun Kuang
11 Mar 2024
$V_kD$: Improving Knowledge Distillation using Orthogonal Projections
Roy Miles, Ismail Elezi, Jiankang Deng
10 Mar 2024
Domain Invariant Learning for Gaussian Processes and Bayesian Exploration
Xilong Zhao, Siyuan Bian, Yaoyun Zhang, Yuliang Zhang, Qinying Gu, Xinbing Wang, Cheng Zhou, Nanyang Ye
18 Dec 2023
Understanding the Effects of Projectors in Knowledge Distillation
Yudong Chen, Sen Wang, Jiajun Liu, Xuwei Xu, Frank de Hoog, Brano Kusy, Zi Huang
26 Oct 2023
NormKD: Normalized Logits for Knowledge Distillation
Zhihao Chi, Tu Zheng, Hengjia Li, Zheng Yang, Boxi Wu, Binbin Lin, D. Cai
01 Aug 2023
Towards Effective Collaborative Learning in Long-Tailed Recognition
Zhengzhuo Xu, Zenghao Chai, Chengying Xu, Chun Yuan, Haiqin Yang
05 May 2023
DiGeo: Discriminative Geometry-Aware Learning for Generalized Few-Shot Object Detection
Jiawei Ma, Yulei Niu, Jincheng Xu, Shiyuan Huang, G. Han, Shih-Fu Chang
16 Mar 2023
Distilling Knowledge via Knowledge Review
Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia
19 Apr 2021
Distilling Causal Effect of Data in Class-Incremental Learning
Xinting Hu, Kaihua Tang, C. Miao, Xiansheng Hua, Hanwang Zhang
02 Mar 2021
Aggregated Residual Transformations for Deep Neural Networks
Saining Xie, Ross B. Girshick, Piotr Dollár, Z. Tu, Kaiming He
16 Nov 2016
Densely Connected Convolutional Networks
Gao Huang, Zhuang Liu, L. van der Maaten, Kilian Q. Weinberger
25 Aug 2016