On the Orthogonality of Knowledge Distillation with Other Techniques: From an Ensemble Perspective

9 September 2020
Seonguk Park, Kiyoon Yoo, Nojun Kwak
Community: FedML

Papers citing "On the Orthogonality of Knowledge Distillation with Other Techniques: From an Ensemble Perspective"

3 / 3 papers shown
Topological Persistence Guided Knowledge Distillation for Wearable Sensor Data
Eun Som Jeon, Hongjun Choi, A. Shukla, Yuan Wang, Hyunglae Lee, M. Buman, P. Turaga
07 Jul 2024
Aggregated Residual Transformations for Deep Neural Networks
Saining Xie, Ross B. Girshick, Piotr Dollár, Z. Tu, Kaiming He
16 Nov 2016
Neural Architecture Search with Reinforcement Learning
Barret Zoph, Quoc V. Le
05 Nov 2016