Towards Understanding Knowledge Distillation
27 May 2021 · arXiv:2105.13093
Mary Phuong, Christoph H. Lampert

Papers citing "Towards Understanding Knowledge Distillation"

Showing 20 of 70 citing papers.

Universal Representation Learning from Multiple Domains for Few-shot Classification
Weihong Li, Xialei Liu, Hakan Bilen
Tags: SSL, OOD, VLM
25 Mar 2021

MalBERT: Using Transformers for Cybersecurity and Malicious Software Detection
Abir Rahali, M. Akhloufi
05 Mar 2021

Investigating Bi-Level Optimization for Learning and Vision from a Unified Perspective: A Survey and Beyond
Risheng Liu, Jiaxin Gao, Jin Zhang, Deyu Meng, Zhouchen Lin
Tags: AI4CE
27 Jan 2021

Robustness and Diversity Seeking Data-Free Knowledge Distillation
Pengchao Han, Jihong Park, Shiqiang Wang, Yejun Liu
07 Nov 2020

Federated Knowledge Distillation
Hyowoon Seo, Jihong Park, Seungeun Oh, M. Bennis, Seong-Lyun Kim
Tags: FedML
04 Nov 2020

Knowledge Distillation in Wide Neural Networks: Risk Bound, Data Efficiency and Imperfect Teacher
Guangda Ji, Zhanxing Zhu
20 Oct 2020

Communication-Efficient and Distributed Learning Over Wireless Networks: Principles and Applications
Jihong Park, S. Samarakoon, Anis Elgabli, Joongheon Kim, M. Bennis, Seong-Lyun Kim, Mérouane Debbah
06 Aug 2020

Temporal Self-Ensembling Teacher for Semi-Supervised Object Detection
Cong Chen, Shouyang Dong, Ye Tian, K. Cao, Li Liu, Yuanhao Guo
13 Jul 2020

Dynamic Group Convolution for Accelerating Convolutional Neural Networks
Z. Su, Linpu Fang, Wenxiong Kang, D. Hu, M. Pietikäinen, Li Liu
08 Jul 2020

Interpreting and Disentangling Feature Components of Various Complexity from DNNs
Jie Ren, Mingjie Li, Zexu Liu, Quanshi Zhang
Tags: CoGe
29 Jun 2020

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
Tags: VLM
09 Jun 2020

Self-Distillation as Instance-Specific Label Smoothing
Zhilu Zhang, M. Sabuncu
09 Jun 2020

An Empirical Analysis of the Impact of Data Augmentation on Knowledge Distillation
Deepan Das, Haley Massa, Abhimanyu Kulkarni, Theodoros Rekatsinas
06 Jun 2020

An Overview of Neural Network Compression
James O'Neill
Tags: AI4CE
05 Jun 2020

COVID-MobileXpert: On-Device COVID-19 Patient Triage and Follow-up using Chest X-rays
Xin Li, Chengyin Li, D. Zhu
06 Apr 2020

Analysis of Knowledge Transfer in Kernel Regime
Arman Rahbar, Ashkan Panahi, Chiranjib Bhattacharyya, Devdatt Dubhashi, M. Chehreghani
30 Mar 2020

Self-Distillation Amplifies Regularization in Hilbert Space
H. Mobahi, Mehrdad Farajtabar, Peter L. Bartlett
13 Feb 2020

Understanding and Improving Knowledge Distillation
Jiaxi Tang, Rakesh Shivanna, Zhe Zhao, Dong Lin, Anima Singh, Ed H. Chi, Sagar Jain
10 Feb 2020

Search to Distill: Pearls are Everywhere but not the Eyes
Yu Liu, Xuhui Jia, Mingxing Tan, Raviteja Vemulapalli, Yukun Zhu, Bradley Green, Xiaogang Wang
20 Nov 2019

Copying Machine Learning Classifiers
Irene Unceta, Jordi Nin, O. Pujol
05 Mar 2019