A Closer Look at Knowledge Distillation with Features, Logits, and Gradients

18 March 2022
Yen-Chang Hsu, James Smith, Yilin Shen, Z. Kira, Hongxia Jin

Papers citing "A Closer Look at Knowledge Distillation with Features, Logits, and Gradients"

4 / 4 papers shown

Harmonizing knowledge Transfer in Neural Network with Unified Distillation
Yaomin Huang, Zaomin Yan, Chaomin Shen, Faming Fang, Guixu Zhang
27 Sep 2024

Semantic Scene Completion with Cleaner Self
Fengyun Wang, Dong Zhang, Hanwang Zhang, Jinhui Tang, Qianru Sun
17 Mar 2023

Goal-Conditioned Q-Learning as Knowledge Distillation
Alexander Levine, S. Feizi
Community: OffRL
28 Aug 2022

MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
Community: 3DH
17 Apr 2017