Knowledge Condensation Distillation
arXiv:2207.05409 · 12 July 2022
Chenxin Li, Mingbao Lin, Zhiyuan Ding, Nie Lin, Yihong Zhuang, Yue Huang, Xinghao Ding, Liujuan Cao
Papers citing "Knowledge Condensation Distillation" (7 of 7 papers shown)
1. Instance Temperature Knowledge Distillation
   Zhengbo Zhang, Yuxi Zhou, Jia Gong, Jun Liu, Zhigang Tu (27 Jun 2024)

2. ReDistill: Residual Encoded Distillation for Peak Memory Reduction of CNNs
   Fang Chen, Gourav Datta, Mujahid Al Rafi, Hyeran Jeon, Meng Tang (06 Jun 2024)

3. Dataset Distillation: A Comprehensive Review
   Ruonan Yu, Songhua Liu, Xinchao Wang (17 Jan 2023)

4. Hint-dynamic Knowledge Distillation
   Yiyang Liu, Chenxin Li, Xiaotong Tu, Xinghao Ding, Yue Huang (30 Nov 2022)

5. Distilling Knowledge via Knowledge Review
   Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia (19 Apr 2021)

6. Knowledge Distillation by On-the-Fly Native Ensemble
   Xu Lan, Xiatian Zhu, S. Gong (12 Jun 2018)

7. MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
   Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam (17 Apr 2017)