Robustness-Reinforced Knowledge Distillation with Correlation Distance and Network Pruning
arXiv: 2311.13934 · 23 November 2023
Seonghak Kim, Gyeongdo Ham, Yucheol Cho, Daeshik Kim
Papers citing "Robustness-Reinforced Knowledge Distillation with Correlation Distance and Network Pruning" (6 of 6 papers shown)
Honey, I Shrunk the Language Model: Impact of Knowledge Distillation Methods on Performance and Explainability
Daniel Hendriks, Philipp Spitzer, Niklas Kühl, G. Satzger
22 Apr 2025 (27 / 0 / 0)
Maximizing Discrimination Capability of Knowledge Distillation with Energy Function
Seonghak Kim, Gyeongdo Ham, Suin Lee, Donggon Jang, Daeshik Kim
24 Nov 2023 (13 / 2 / 0)
Asymmetric Temperature Scaling Makes Larger Networks Teach Well Again
Xin-Chun Li, Wenxuan Fan, Shaoming Song, Yinchuan Li, Bingshuai Li, Yunfeng Shao, De-Chuan Zhan
10 Oct 2022 (21 / 29 / 0)
Distilling Knowledge via Knowledge Review
Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia
19 Apr 2021 (147 / 416 / 0)
Collaborative Distillation for Ultra-Resolution Universal Style Transfer
Huan Wang, Yijun Li, Yuehai Wang, Haoji Hu, Ming-Hsuan Yang
18 Mar 2020 (107 / 98 / 0)
MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
17 Apr 2017 (948 / 20,214 / 0)