arXiv:2311.14334
Maximizing Discrimination Capability of Knowledge Distillation with Energy Function
24 November 2023
Seonghak Kim
Gyeongdo Ham
Suin Lee
Donggon Jang
Daeshik Kim
Papers citing "Maximizing Discrimination Capability of Knowledge Distillation with Energy Function" (6 of 6 papers shown):
- Robustness-Reinforced Knowledge Distillation with Correlation Distance and Network Pruning. Seonghak Kim, Gyeongdo Ham, Yucheol Cho, Daeshik Kim. 23 Nov 2023.
- Function-Consistent Feature Distillation. Dongyang Liu, Meina Kan, Shiguang Shan, Xilin Chen. 24 Apr 2023.
- Asymmetric Temperature Scaling Makes Larger Networks Teach Well Again. Xin-Chun Li, Wenxuan Fan, Shaoming Song, Yinchuan Li, Bingshuai Li, Yunfeng Shao, De-Chuan Zhan. 10 Oct 2022.
- Distilling Knowledge via Knowledge Review. Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia. 19 Apr 2021.
- Learning Student-Friendly Teacher Networks for Knowledge Distillation. D. Park, Moonsu Cha, C. Jeong, Daesin Kim, Bohyung Han. 12 Feb 2021.
- ImageNet Large Scale Visual Recognition Challenge. Olga Russakovsky, Jia Deng, Hao Su, J. Krause, S. Satheesh, ..., A. Karpathy, A. Khosla, Michael S. Bernstein, Alexander C. Berg, Li Fei-Fei. 01 Sep 2014.