Distilling a Powerful Student Model via Online Knowledge Distillation

26 March 2021
Shaojie Li, Mingbao Lin, Yan Wang, Yongjian Wu, Yonghong Tian, Ling Shao, Rongrong Ji
Community: FedML
Links: ArXiv · PDF · HTML
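For readers skimming the citation list below, the sketch that follows illustrates the general idea behind online knowledge distillation that these papers build on: peer networks are trained simultaneously and distill each other's softened predictions, rather than learning from a fixed, pre-trained teacher. This is a minimal, generic sketch in PyTorch; the two-peer setup, the temperature T, and the weight alpha are illustrative assumptions, not the specific method of the paper above or of any paper listed here.

```python
# Generic online (mutual) knowledge distillation sketch: two peer students
# train together, each adding a KL term toward the other's softened logits
# on top of the usual cross-entropy loss. Toy models and hyperparameters
# (T, alpha) are assumed values for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

T, alpha = 4.0, 0.5  # softmax temperature and distillation weight (assumed)

def kd_loss(student_logits, peer_logits):
    """KL divergence between the peer's and the student's softened distributions."""
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    p_peer = F.softmax(peer_logits.detach() / T, dim=1)  # stop gradient through the peer
    return F.kl_div(log_p_student, p_peer, reduction="batchmean") * (T * T)

# Two toy peer networks standing in for real backbones.
peers = [nn.Linear(32, 10), nn.Linear(32, 10)]
opts = [torch.optim.SGD(p.parameters(), lr=0.1) for p in peers]

# One synthetic training step on random data.
x, y = torch.randn(8, 32), torch.randint(0, 10, (8,))
logits = [p(x) for p in peers]
for i, (model, opt) in enumerate(zip(peers, opts)):
    ce = F.cross_entropy(logits[i], y)          # supervised loss on hard labels
    kd = kd_loss(logits[i], logits[1 - i])      # distill from the other peer's soft targets
    loss = (1 - alpha) * ce + alpha * kd
    opt.zero_grad()
    loss.backward()
    opt.step()
```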

Papers citing "Distilling a Powerful Student Model via Online Knowledge Distillation"

5 / 5 papers shown
FEDS: Feature and Entropy-Based Distillation Strategy for Efficient Learned Image Compression
H. Fu, Jie Liang, Zhenman Fang, Jingning Han · 09 Mar 2025

DiReDi: Distillation and Reverse Distillation for AIoT Applications
Chen Sun, Qing Tong, Wenshuang Yang, Wenqi Zhang · 12 Sep 2024

Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition
Chuanguang Yang, Zhulin An, Helong Zhou, Fuzhen Zhuang, Yongjun Xu, Qian Zhang · 23 Jul 2022

Distilling Knowledge via Knowledge Review
Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia · 19 Apr 2021

Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong · 12 Jun 2018