Curriculum Temperature for Knowledge Distillation
arXiv:2211.16231
29 November 2022
Zheng Li, Xiang Li, Lingfeng Yang, Borui Zhao, Renjie Song, Lei Luo, Jun Li, Jian Yang
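The paper's title names its core idea: distilling with a temperature that follows a curriculum rather than staying fixed. As a rough illustration only, the sketch below shows a standard temperature-scaled KD loss whose temperature is driven by a simple schedule; the linear schedule, the toy teacher/student modules, and all hyperparameters are assumptions for illustration and do not reproduce the authors' method.

```python
# Minimal sketch: temperature-scaled knowledge distillation with a
# scheduled ("curriculum") temperature. Illustrative only; the schedule,
# models, and hyperparameters are assumptions, not the CTKD implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


def kd_loss(student_logits, teacher_logits, temperature):
    """KL-based distillation loss at a given temperature."""
    log_p_s = F.log_softmax(student_logits / temperature, dim=1)
    p_t = F.softmax(teacher_logits / temperature, dim=1)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * temperature ** 2


def curriculum_temperature(step, total_steps, t_start=1.0, t_end=4.0):
    """Hypothetical linear warm-up from a low to a higher temperature."""
    frac = min(step / max(total_steps, 1), 1.0)
    return t_start + frac * (t_end - t_start)


if __name__ == "__main__":
    torch.manual_seed(0)
    teacher = nn.Linear(32, 10)  # stand-ins for real teacher/student networks
    student = nn.Linear(32, 10)
    opt = torch.optim.SGD(student.parameters(), lr=0.1)

    total_steps = 100
    for step in range(total_steps):
        x = torch.randn(64, 32)
        y = torch.randint(0, 10, (64,))
        with torch.no_grad():
            t_logits = teacher(x)
        s_logits = student(x)

        # Combine the usual supervised loss with the scheduled-temperature KD term.
        T = curriculum_temperature(step, total_steps)
        loss = F.cross_entropy(s_logits, y) + kd_loss(s_logits, t_logits, T)

        opt.zero_grad()
        loss.backward()
        opt.step()
```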
Papers citing "Curriculum Temperature for Knowledge Distillation" (5 of 55 shown):
Swing Distillation: A Privacy-Preserving Knowledge Distillation Framework
Junzhuo Li, Xinwei Wu, Weilong Dong, Shuangzhi Wu, Chao Bian, Deyi Xiong
16 Dec 2022
Revisiting Label Smoothing and Knowledge Distillation Compatibility: What was Missing?
Keshigeyan Chandrasegaran, Ngoc-Trung Tran, Yunqing Zhao, Ngai-Man Cheung
29 Jun 2022
Distilling Knowledge via Knowledge Review
Pengguang Chen, Shu Liu, Hengshuang Zhao, Jiaya Jia
19 Apr 2021
Curriculum DeepSDF
Yueqi Duan, Haidong Zhu, He Wang, Li Yi, Ram Nevatia, Leonidas J. Guibas
19 Mar 2020
MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
17 Apr 2017