Variational Information Distillation for Knowledge Transfer
arXiv:1904.05835 · 11 April 2019
Sungsoo Ahn, S. Hu, Andreas C. Damianou, Neil D. Lawrence, Zhenwen Dai
Papers citing "Variational Information Distillation for Knowledge Transfer" (21 of 321 shown)
| Title | Authors | Tags | Likes | Citations | Comments | Date |
|---|---|---|---|---|---|---|
| Transferring Inductive Biases through Knowledge Distillation | Samira Abnar, Mostafa Dehghani, Willem H. Zuidema | | 33 | 57 | 0 | 31 May 2020 |
| Learning from a Lightweight Teacher for Efficient Knowledge Distillation | Yuang Liu, Wei Zhang, Jun Wang | | 6 | 3 | 0 | 19 May 2020 |
| Data-Free Network Quantization With Adversarial Knowledge Distillation | Yoojin Choi, Jihwan P. Choi, Mostafa El-Khamy, Jungwon Lee | MQ | 27 | 119 | 0 | 08 May 2020 |
| Heterogeneous Knowledge Distillation using Information Flow Modeling | Nikolaos Passalis, Maria Tzelepi, Anastasios Tefas | | 18 | 138 | 0 | 02 May 2020 |
| Can a powerful neural network be a teacher for a weaker neural network? | Nicola Landro, I. Gallo, Riccardo La Grassa | | 11 | 0 | 0 | 01 May 2020 |
| Distilling Spikes: Knowledge Distillation in Spiking Neural Networks | R. K. Kushawaha, S. Kumar, Biplab Banerjee, R. Velmurugan | | 8 | 31 | 0 | 01 May 2020 |
| Regularizing Class-wise Predictions via Self-knowledge Distillation | Sukmin Yun, Jongjin Park, Kimin Lee, Jinwoo Shin | | 29 | 274 | 0 | 31 Mar 2020 |
| Monocular Depth Estimation Based On Deep Learning: An Overview | Chaoqiang Zhao, Qiyu Sun, Chongzhen Zhang, Yang Tang, Feng Qian | MDE | 65 | 251 | 0 | 14 Mar 2020 |
| SuperMix: Supervising the Mixing Data Augmentation | Ali Dabouei, Sobhan Soleymani, Fariborz Taherkhani, Nasser M. Nasrabadi | | 19 | 98 | 0 | 10 Mar 2020 |
| Freeze the Discriminator: a Simple Baseline for Fine-Tuning GANs | Sangwoo Mo, Minsu Cho, Jinwoo Shin | | 30 | 212 | 0 | 25 Feb 2020 |
| Self-Distillation Amplifies Regularization in Hilbert Space | H. Mobahi, Mehrdad Farajtabar, Peter L. Bartlett | | 33 | 227 | 0 | 13 Feb 2020 |
| Subclass Distillation | Rafael Müller, Simon Kornblith, Geoffrey E. Hinton | | 28 | 33 | 0 | 10 Feb 2020 |
| Modeling Teacher-Student Techniques in Deep Neural Networks for Knowledge Distillation | Sajjad Abbasi, M. Hajabdollahi, N. Karimi, S. Samavi | | 10 | 28 | 0 | 31 Dec 2019 |
| Real-time Policy Distillation in Deep Reinforcement Learning | Yuxiang Sun, Pooyan Fazli | | 9 | 6 | 0 | 29 Dec 2019 |
| Dreaming to Distill: Data-free Knowledge Transfer via DeepInversion | Hongxu Yin, Pavlo Molchanov, Zhizhong Li, J. Álvarez, Arun Mallya, Derek Hoiem, N. Jha, Jan Kautz | | 28 | 553 | 0 | 18 Dec 2019 |
| QUEST: Quantized embedding space for transferring knowledge | Himalaya Jain, Spyros Gidaris, N. Komodakis, P. Pérez, Matthieu Cord | | 18 | 14 | 0 | 03 Dec 2019 |
| Online Knowledge Distillation with Diverse Peers | Defang Chen, Jian-Ping Mei, Can Wang, Yan Feng, Chun-Yen Chen | FedML | 9 | 297 | 0 | 01 Dec 2019 |
| Preparing Lessons: Improve Knowledge Distillation with Better Supervision | Tiancheng Wen, Shenqi Lai, Xueming Qian | | 25 | 68 | 0 | 18 Nov 2019 |
| Contrastive Representation Distillation | Yonglong Tian, Dilip Krishnan, Phillip Isola | | 47 | 1,031 | 0 | 23 Oct 2019 |
| Training convolutional neural networks with cheap convolutions and online distillation | Jiao Xie, Shaohui Lin, Yichen Zhang, Linkai Luo | | 19 | 12 | 0 | 28 Sep 2019 |
| Zero-shot Knowledge Transfer via Adversarial Belief Matching | P. Micaelli, Amos Storkey | | 19 | 228 | 0 | 23 May 2019 |