An Empirical Analysis of the Impact of Data Augmentation on Knowledge Distillation

6 June 2020
Deepan Das, Haley Massa, Abhimanyu Kulkarni, Theodoros Rekatsinas

Papers citing "An Empirical Analysis of the Impact of Data Augmentation on Knowledge Distillation"

4 papers shown
Distilling Calibrated Student from an Uncalibrated Teacher
Ishan Mishra, Sethu Vamsi Krishna, Deepak Mishra
22 Feb 2023
Curriculum Temperature for Knowledge Distillation
Zheng Li, Xiang Li, Lingfeng Yang, Borui Zhao, Renjie Song, Lei Luo, Jun Yu Li, Jian Yang
29 Nov 2022
Meta Knowledge Distillation
Jihao Liu, Boxiao Liu, Hongsheng Li, Yu Liu
16 Feb 2022
Isotonic Data Augmentation for Knowledge Distillation
Wanyun Cui, Sen Yan
03 Jul 2021