SMILE: Self-Distilled MIxup for Efficient Transfer LEarning

25 March 2021
Xingjian Li, Haoyi Xiong, Chengzhong Xu, Dejing Dou
ArXiv (abs) · PDF · HTML

Papers citing "SMILE: Self-Distilled MIxup for Efficient Transfer LEarning"

2 citing papers
Distilling Calibrated Student from an Uncalibrated Teacher
Ishan Mishra, Sethu Vamsi Krishna, Deepak Mishra
FedML
22 Feb 2023
Understanding the Role of Mixup in Knowledge Distillation: An Empirical Study
IEEE Workshop/Winter Conference on Applications of Computer Vision (WACV), 2022
Hongjun Choi, Eunyeong Jeon, Ankita Shukla, Pavan Turaga
08 Nov 2022