Search for Better Students to Learn Distilled Knowledge

30 January 2020
Jindong Gu, Volker Tresp

Papers citing "Search for Better Students to Learn Distilled Knowledge"

6 papers shown

Explainability and Robustness of Deep Visual Classification Models
Jindong Gu · AAML · 03 Jan 2023

Design Automation for Fast, Lightweight, and Effective Deep Learning Models: A Survey
Dalin Zhang, Kaixuan Chen, Yan Zhao, B. Yang, Li-Ping Yao, Christian S. Jensen · 22 Aug 2022

Simple Distillation Baselines for Improving Small Self-supervised Models
Jindong Gu, Wei Liu, Yonglong Tian · 21 Jun 2021

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao · VLM · 09 Jun 2020

ResKD: Residual-Guided Knowledge Distillation
Xuewei Li, Songyuan Li, Bourahla Omar, Fei Wu, Xi Li · 08 Jun 2020

Neural Architecture Search with Reinforcement Learning
Barret Zoph, Quoc V. Le · 05 Nov 2016