BD-KD: Balancing the Divergences for Online Knowledge Distillation

25 December 2022
Ibtihel Amara, N. Sepahvand, B. Meyer, W. Gross, J. Clark
Papers citing "BD-KD: Balancing the Divergences for Online Knowledge Distillation"

3 papers shown

Knowledge Distillation with Adapted Weight
Sirong Wu, Xi Luo, Junjie Liu, Yuhui Deng
06 Jan 2025

Rethinking Kullback-Leibler Divergence in Knowledge Distillation for Large Language Models
Taiqiang Wu, Chaofan Tao, Jiahao Wang, Zhe Zhao, Ngai Wong
03 Apr 2024

Switchable Online Knowledge Distillation
Biao Qian, Yang Wang, Hongzhi Yin, Richang Hong, Meng Wang
12 Sep 2022