arXiv: 2303.14666
Generalization Matters: Loss Minima Flattening via Parameter Hybridization for Efficient Online Knowledge Distillation
26 March 2023
Tianli Zhang, Mengqi Xue, Jiangtao Zhang, Haofei Zhang, Yu Wang, Lechao Cheng, Jie Song, Mingli Song
Papers citing "Generalization Matters: Loss Minima Flattening via Parameter Hybridization for Efficient Online Knowledge Distillation" (7 of 7 papers shown)
"Asymmetric Decision-Making in Online Knowledge Distillation: Unifying Consensus and Divergence" — Zhaowei Chen, Borui Zhao, Yuchen Ge, Yuhao Chen, Renjie Song, Jiajun Liang (09 Mar 2025, 0 citations)
"Multi-Teacher Knowledge Distillation with Reinforcement Learning for Visual Recognition" — Chuanguang Yang, Xinqiang Yu, Han Yang, Zhulin An, Chengqing Yu, Libo Huang, Y. Xu (22 Feb 2025, 0 citations)
"Progressive Feature Self-reinforcement for Weakly Supervised Semantic Segmentation" — Jingxuan He, Lechao Cheng, Chaowei Fang, Zunlei Feng, Tingting Mu, Min-Gyoo Song (14 Dec 2023, 7 citations)
"Knowledge Distillation by On-the-Fly Native Ensemble" — Xu Lan, Xiatian Zhu, S. Gong (12 Jun 2018, 472 citations)
"On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima" — N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang (15 Sep 2016, 2,878 citations)
"Densely Connected Convolutional Networks" — Gao Huang, Zhuang Liu, Laurens van der Maaten, Kilian Q. Weinberger (25 Aug 2016, 35,884 citations)
"ImageNet Large Scale Visual Recognition Challenge" — Olga Russakovsky, Jia Deng, Hao Su, J. Krause, S. Satheesh, ..., A. Karpathy, A. Khosla, Michael S. Bernstein, Alexander C. Berg, Li Fei-Fei (01 Sep 2014, 39,083 citations)