A Closer Look at Knowledge Distillation with Features, Logits, and Gradients
Yen-Chang Hsu, James Smith, Yilin Shen, Z. Kira, Hongxia Jin
arXiv:2203.10163 · 18 March 2022

Cited By
Papers citing "A Closer Look at Knowledge Distillation with Features, Logits, and Gradients" (4 of 4 shown):
Harmonizing knowledge Transfer in Neural Network with Unified Distillation
Yaomin Huang, Zaomin Yan, Chaomin Shen, Faming Fang, Guixu Zhang
27 Sep 2024

Semantic Scene Completion with Cleaner Self
Fengyun Wang, Dong Zhang, Hanwang Zhang, Jinhui Tang, Qianru Sun
17 Mar 2023

Goal-Conditioned Q-Learning as Knowledge Distillation
Alexander Levine, S. Feizi
28 Aug 2022

MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
17 Apr 2017