Learning Student-Friendly Teacher Networks for Knowledge Distillation
12 February 2021 · arXiv:2102.07650
D. Park, Moonsu Cha, C. Jeong, Daesin Kim, Bohyung Han

Papers citing "Learning Student-Friendly Teacher Networks for Knowledge Distillation" (5 of 5 papers shown)

Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective
Jinjing Zhu, Songze Li, Lin Wang · 13 Jan 2025

Quantifying Knowledge Distillation Using Partial Information Decomposition
Pasan Dissanayake, Faisal Hamman, Barproda Halder, Ilia Sucholutsky, Qiuyi Zhang, Sanghamitra Dutta · 12 Nov 2024

Collaborative Learning for Enhanced Unsupervised Domain Adaptation
Minhee Cho, Hyesong Choi, Hayeon Jo, Dongbo Min · 04 Sep 2024

Collaborative Distillation for Ultra-Resolution Universal Style Transfer
Huan Wang, Yijun Li, Yuehai Wang, Haoji Hu, Ming-Hsuan Yang · 18 Mar 2020

Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong · 12 Jun 2018