Distilling Knowledge via Knowledge Review
arXiv: 2104.09044
19 April 2021
Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia
ArXiv | PDF | HTML
Papers citing "Distilling Knowledge via Knowledge Review" (4 of 4 papers shown)
1. Head-Tail-Aware KL Divergence in Knowledge Distillation for Spiking Neural Networks
   Tianqing Zhang, Zixin Zhu, Kairong Yu, Hongwei Wang
   29 Apr 2025

2. Swapped Logit Distillation via Bi-level Teacher Alignment
   Stephen Ekaputra Limantoro, Jhe-Hao Lin, Chih-Yu Wang, Yi-Lung Tsai, Hong-Han Shuai, Ching-Chun Huang, Wen-Huang Cheng
   27 Apr 2025

3. ReDistill: Residual Encoded Distillation for Peak Memory Reduction of CNNs
   Fang Chen, Gourav Datta, Mujahid Al Rafi, Hyeran Jeon, Meng Tang
   06 Jun 2024

4. MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
   Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
   17 Apr 2017