arXiv: 2307.00811
Review helps learn better: Temporal Supervised Knowledge Distillation
Dongwei Wang, Zhi-Long Han, Yanmei Wang, Xi'ai Chen, Baicheng Liu, Yandong Tang
3 July 2023
Papers citing "Review helps learn better: Temporal Supervised Knowledge Distillation" (6 papers):
1. Personalized Forgetting Mechanism with Concept-Driven Knowledge Tracing. Shanshan Wang, Ying Hu, Xun Yang, Zhongzhou Zhang, Keyang Wang, Xingyi Zhang. 18 Apr 2024 (0 citations).
2. Distilling Knowledge via Knowledge Review. Pengguang Chen, Shu Liu, Hengshuang Zhao, Jiaya Jia. 19 Apr 2021 (419 citations).
3. Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching. Mingi Ji, Byeongho Heo, Sungrae Park. 05 Feb 2021 (143 citations).
4. MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications. Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam. 17 Apr 2017 (20,549 citations).
5. Aggregated Residual Transformations for Deep Neural Networks. Saining Xie, Ross B. Girshick, Piotr Dollár, Z. Tu, Kaiming He. 16 Nov 2016 (10,214 citations).
6. Convolutional LSTM Network: A Machine Learning Approach for Precipitation Nowcasting. Xingjian Shi, Zhourong Chen, Hao Wang, Dit-Yan Yeung, W. Wong, W. Woo. 13 Jun 2015 (7,902 citations).