Revisiting Label Smoothing and Knowledge Distillation Compatibility: What was Missing?
arXiv:2206.14532, 29 June 2022
Keshigeyan Chandrasegaran, Ngoc-Trung Tran, Yunqing Zhao, Ngai-Man Cheung
Papers citing "Revisiting Label Smoothing and Knowledge Distillation Compatibility: What was Missing?" (5 papers shown)
Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective
Jinjing Zhu, Songze Li, Lin Wang
13 Jan 2025
Curriculum Temperature for Knowledge Distillation
Zheng Li, Xiang Li, Lingfeng Yang, Borui Zhao, Renjie Song, Lei Luo, Jun Li, Jian Yang
29 Nov 2022
AlphaNet: Improved Training of Supernets with Alpha-Divergence
Dilin Wang, Chengyue Gong, Meng Li, Qiang Liu, Vikas Chandra
16 Feb 2021
SEED: Self-supervised Distillation For Visual Representation
Zhiyuan Fang, Jianfeng Wang, Lijuan Wang, Lei Zhang, Yezhou Yang, Zicheng Liu
12 Jan 2021
Bag of Tricks for Image Classification with Convolutional Neural Networks
Tong He, Zhi Zhang, Hang Zhang, Zhongyue Zhang, Junyuan Xie, Mu Li
04 Dec 2018