arXiv: 2208.10139
Rethinking Knowledge Distillation via Cross-Entropy
22 August 2022
Zhendong Yang, Zhe Li, Yuan Gong, Tianke Zhang, Shanshan Lao, Chun Yuan, Yu Li
Papers citing "Rethinking Knowledge Distillation via Cross-Entropy" (10 shown)
Towards Robust and Efficient Cloud-Edge Elastic Model Adaptation via Selective Entropy Distillation
Yaofo Chen, Shuaicheng Niu, Yaowei Wang, Shoukai Xu, Hengjie Song, Mingkui Tan
27 Feb 2024
Generalizable Heterogeneous Federated Cross-Correlation and Instance Similarity Learning [FedML]
Wenke Huang, J. J. Valero-Mas, Dasaem Jeong, Bo Du
28 Sep 2023
Bridging Cross-task Protocol Inconsistency for Distillation in Dense Object Detection
Longrong Yang, Xianpan Zhou, Xuewei Li, Liang Qiao, Zheyang Li, Zi-Liang Yang, Gaoang Wang, Xi Li
28 Aug 2023
Effective Whole-body Pose Estimation with Two-stages Distillation
Zhendong Yang, Ailing Zeng, Chun Yuan, Yu Li
29 Jul 2023
Are Large Kernels Better Teachers than Transformers for ConvNets?
Tianjin Huang, Lu Yin, Zhenyu (Allen) Zhang, Lijuan Shen, Meng Fang, Mykola Pechenizkiy, Zhangyang Wang, Shiwei Liu
30 May 2023
Student-friendly Knowledge Distillation
Mengyang Yuan, Bo Lang, Fengnan Quan
18 May 2023
Generic-to-Specific Distillation of Masked Autoencoders
Wei Huang, Zhiliang Peng, Li Dong, Furu Wei, Jianbin Jiao, Qixiang Ye
28 Feb 2023
ViTKD: Practical Guidelines for ViT Feature Knowledge Distillation
Zhendong Yang, Zhe Li, Ailing Zeng, Zexian Li, Chun Yuan, Yu Li
06 Sep 2022
Distilling Knowledge via Knowledge Review
Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia
19 Apr 2021
MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications [3DH]
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
17 Apr 2017