arXiv: 2202.12488
Learn From the Past: Experience Ensemble Knowledge Distillation
Chaofei Wang, Shaowei Zhang, Shiji Song, Gao Huang
25 February 2022
Papers citing "Learn From the Past: Experience Ensemble Knowledge Distillation" (7 of 7 papers shown):
1. Computation-efficient Deep Learning for Computer Vision: A Survey
   Yulin Wang, Yizeng Han, Chaofei Wang, Shiji Song, Qi Tian, Gao Huang
   27 Aug 2023

2. Boosting Residual Networks with Group Knowledge
   Shengji Tang, Peng Ye, Baopu Li, Wei Lin, Tao Chen, Tong He, Chong Yu, Wanli Ouyang
   26 Aug 2023

3. A Survey of Historical Learning: Learning Models with Learning History
   Xiang Li, Ge Wu, Lingfeng Yang, Wenzhe Wang, Renjie Song, Jian Yang
   23 Mar 2023

4. Learning Student-Friendly Teacher Networks for Knowledge Distillation
   D. Park, Moonsu Cha, C. Jeong, Daesin Kim, Bohyung Han
   12 Feb 2021

5. Knowledge Distillation by On-the-Fly Native Ensemble
   Xu Lan, Xiatian Zhu, S. Gong
   12 Jun 2018

6. MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
   Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
   17 Apr 2017

7. Aggregated Residual Transformations for Deep Neural Networks
   Saining Xie, Ross B. Girshick, Piotr Dollár, Z. Tu, Kaiming He
   16 Nov 2016