Cited By
Distilling a Powerful Student Model via Online Knowledge Distillation
arXiv 2103.14473, 26 March 2021
Shaojie Li, Mingbao Lin, Yan Wang, Yongjian Wu, Yonghong Tian, Ling Shao, Rongrong Ji
Papers citing "Distilling a Powerful Student Model via Online Knowledge Distillation" (5 of 5 papers shown):
1. FEDS: Feature and Entropy-Based Distillation Strategy for Efficient Learned Image Compression
   H. Fu, Jie Liang, Zhenman Fang, Jingning Han (09 Mar 2025)
2. DiReDi: Distillation and Reverse Distillation for AIoT Applications
   Chen Sun, Qing Tong, Wenshuang Yang, Wenqi Zhang (12 Sep 2024)
3. Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition
   Chuanguang Yang, Zhulin An, Helong Zhou, Fuzhen Zhuang, Yongjun Xu, Qian Zhang (23 Jul 2022)
4. Distilling Knowledge via Knowledge Review
   Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia (19 Apr 2021)
5. Knowledge Distillation by On-the-Fly Native Ensemble
   Xu Lan, Xiatian Zhu, S. Gong (12 Jun 2018)