Why distillation helps: a statistical perspective
arXiv:2005.10419 · 21 May 2020
A. Menon, A. S. Rawat, Sashank J. Reddi, Seungyeon Kim, Sanjiv Kumar
FedML
Papers citing "Why distillation helps: a statistical perspective" (7 papers shown)
High-dimensional Analysis of Knowledge Distillation: Weak-to-Strong Generalization and Scaling Laws
M. E. Ildiz, Halil Alperen Gozeten, Ege Onur Taga, Marco Mondelli, Samet Oymak
24 Oct 2024 · 54 · 2 · 0

Knowledge Distillation for Oriented Object Detection on Aerial Images
Yicheng Xiao, Junpeng Zhang
ObjD
20 Jun 2022 · 19 · 0 · 0

Towards Model Agnostic Federated Learning Using Knowledge Distillation
A. Afonin, Sai Praneeth Karimireddy
FedML
28 Oct 2021 · 30 · 44 · 0

Knowledge Distillation as Semiparametric Inference
Tri Dao, G. Kamath, Vasilis Syrgkanis, Lester W. Mackey
20 Apr 2021 · 22 · 31 · 0

Dual-Teacher++: Exploiting Intra-domain and Inter-domain Knowledge with Reliable Transfer for Cardiac Segmentation
Kang Li, Shujun Wang, Lequan Yu, Pheng-Ann Heng
07 Jan 2021 · 60 · 28 · 0

Self-Distillation Amplifies Regularization in Hilbert Space
H. Mobahi, Mehrdad Farajtabar, Peter L. Bartlett
13 Feb 2020 · 19 · 226 · 0

Large scale distributed neural network training through online distillation
Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
FedML
09 Apr 2018 · 272 · 404 · 0