Improving Knowledge Distillation via Transferring Learning Ability
arXiv: 2304.11923
24 April 2023
Authors: Long Liu, Tong Li, Hui Cheng
Papers citing "Improving Knowledge Distillation via Transferring Learning Ability" (3 of 3 papers shown):

Pro-KD: Progressive Distillation by Following the Footsteps of the Teacher
Mehdi Rezagholizadeh, A. Jafari, Puneeth Salad, Pranav Sharma, Ali Saheb Pasand, A. Ghodsi
16 Oct 2021

Distilling Knowledge via Knowledge Review
Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia
19 Apr 2021

MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
17 Apr 2017