ResearchTrend.AI
The State of Knowledge Distillation for Classification
arXiv:1912.10850

20 December 2019
Fabian Ruffy
K. Chahal

Papers citing "The State of Knowledge Distillation for Classification"

2 / 2 papers shown
Mutual Distillation Learning Network for Trajectory-User Linking
Wei-Neng Chen, Shuzhe Li, Chao Huang, Yanwei Yu, Yongguo Jiang, Junyu Dong
08 May 2022
Self-Distillation from the Last Mini-Batch for Consistency Regularization
Yiqing Shen, Liwu Xu, Yuzhe Yang, Yaqian Li, Yandong Guo
30 Mar 2022