Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification
arXiv:2104.13298 · 27 April 2021
Yixiao Ge, Xiao Zhang, Ching Lam Choi, Ka Chun Cheung, Peipei Zhao, Feng Zhu, Xiaogang Wang, Rui Zhao, Hongsheng Li
Communities: FedML, UQCV
Papers citing "Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification" (5 papers shown)
SnapCap: Efficient Snapshot Compressive Video Captioning
Jianqiao Sun, Yudi Su, Hao Zhang, Ziheng Cheng, Zequn Zeng, Zhengjue Wang, Bo Chen, Xin Yuan
10 Jan 2024

Re-labeling ImageNet: from Single to Multi-Labels, from Global to Localized Labels
Sangdoo Yun, Seong Joon Oh, Byeongho Heo, Dongyoon Han, Junsuk Choe, Sanghyuk Chun
13 Jan 2021

Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong
12 Jun 2018

Mean teachers are better role models: Weight-averaged consistency targets improve semi-supervised deep learning results
Antti Tarvainen, Harri Valpola
Communities: OOD, MoMe
06 Mar 2017

Aggregated Residual Transformations for Deep Neural Networks
Saining Xie, Ross B. Girshick, Piotr Dollár, Z. Tu, Kaiming He
16 Nov 2016