Student Helping Teacher: Teacher Evolution via Self-Knowledge Distillation

1 October 2021
Zheng Li, Xiang Li, Lingfeng Yang, Jian Yang, Zhigeng Pan

Papers citing "Student Helping Teacher: Teacher Evolution via Self-Knowledge Distillation"

2 / 2 papers shown
Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong
187 · 472 · 0
12 Jun 2018

Aggregated Residual Transformations for Deep Neural Networks
Saining Xie, Ross B. Girshick, Piotr Dollár, Z. Tu, Kaiming He
261 · 10,106 · 0
16 Nov 2016