Why distillation helps: a statistical perspective

21 May 2020
arXiv:2005.10419
A. Menon, A. S. Rawat, Sashank J. Reddi, Seungyeon Kim, Sanjiv Kumar
FedML

Papers citing "Why distillation helps: a statistical perspective" (7 papers shown)

High-dimensional Analysis of Knowledge Distillation: Weak-to-Strong Generalization and Scaling Laws
M. E. Ildiz, Halil Alperen Gozeten, Ege Onur Taga, Marco Mondelli, Samet Oymak
24 Oct 2024

Knowledge Distillation for Oriented Object Detection on Aerial Images
Yicheng Xiao, Junpeng Zhang
ObjD
20 Jun 2022

Towards Model Agnostic Federated Learning Using Knowledge Distillation
A. Afonin, Sai Praneeth Karimireddy
FedML
28 Oct 2021

Knowledge Distillation as Semiparametric Inference
Tri Dao, G. Kamath, Vasilis Syrgkanis, Lester W. Mackey
20 Apr 2021

Dual-Teacher++: Exploiting Intra-domain and Inter-domain Knowledge with Reliable Transfer for Cardiac Segmentation
Kang Li, Shujun Wang, Lequan Yu, Pheng-Ann Heng
07 Jan 2021

Self-Distillation Amplifies Regularization in Hilbert Space
H. Mobahi, Mehrdad Farajtabar, Peter L. Bartlett
13 Feb 2020

Large scale distributed neural network training through online distillation
Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
FedML
09 Apr 2018