BD-KD: Balancing the Divergences for Online Knowledge Distillation
arXiv:2212.12965 · 25 December 2022
Ibtihel Amara, N. Sepahvand, B. Meyer, W. Gross, J. Clark
Papers citing "BD-KD: Balancing the Divergences for Online Knowledge Distillation" (3 papers)

1. Knowledge Distillation with Adapted Weight
   Sirong Wu, Xi Luo, Junjie Liu, Yuhui Deng
   06 Jan 2025

2. Rethinking Kullback-Leibler Divergence in Knowledge Distillation for Large Language Models
   Taiqiang Wu, Chaofan Tao, Jiahao Wang, Zhe Zhao, Ngai Wong
   03 Apr 2024

3. Switchable Online Knowledge Distillation
   Biao Qian, Yang Wang, Hongzhi Yin, Richang Hong, Meng Wang
   12 Sep 2022