FitNets: Hints for Thin Deep Nets (arXiv:1412.6550)
19 December 2014
Adriana Romero, Nicolas Ballas, Samira Ebrahimi Kahou, Antoine Chassang, C. Gatta, Yoshua Bengio
Papers citing "FitNets: Hints for Thin Deep Nets" (showing 50 of 667):
- FEED: Feature-level Ensemble for Knowledge Distillation. Seonguk Park, Nojun Kwak. 24 Sep 2019.
- CNN-based RGB-D Salient Object Detection: Learn, Select and Fuse. Hao Chen, Youfu Li. 20 Sep 2019.
- Hint-Based Training for Non-Autoregressive Machine Translation. Zhuohan Li, Zi Lin, Di He, Fei Tian, Tao Qin, Liwei Wang, Tie-Yan Liu. 15 Sep 2019.
- Deep Elastic Networks with Model Selection for Multi-Task Learning. Chanho Ahn, Eunwoo Kim, Songhwai Oh. 11 Sep 2019.
- Knowledge Transfer Graph for Deep Collaborative Learning. Soma Minami, Tsubasa Hirakawa, Takayoshi Yamashita, H. Fujiyoshi. 10 Sep 2019.
- Extreme Low Resolution Activity Recognition with Confident Spatial-Temporal Attention Transfer. Yucai Bai, Qinglong Zou, Xieyuanli Chen, Lingxi Li, Zhengming Ding, Long Chen. 09 Sep 2019.
- Knowledge Distillation for End-to-End Person Search. Bharti Munjal, Fabio Galasso, S. Amin. 03 Sep 2019.
- Patient Knowledge Distillation for BERT Model Compression. S. Sun, Yu Cheng, Zhe Gan, Jingjing Liu. 25 Aug 2019.
- Progressive Face Super-Resolution via Attention to Facial Landmark. Deok-Hun Kim, Minseon Kim, Gihyun Kwon, Daeshik Kim. 22 Aug 2019.
- MobileFAN: Transferring Deep Hidden Representation for Face Alignment. Yang Zhao, Yifan Liu, Chunhua Shen, Yongsheng Gao, Shengwu Xiong. 11 Aug 2019.
- Effective Training of Convolutional Neural Networks with Low-bitwidth Weights and Activations. Bohan Zhuang, Jing Liu, Mingkui Tan, Lingqiao Liu, Ian Reid, Chunhua Shen. 10 Aug 2019.
- Distilled Siamese Networks for Visual Tracking. Jianbing Shen, Yuanpei Liu, Xingping Dong, Xiankai Lu, Fahad Shahbaz Khan, Guosheng Lin. 24 Jul 2019.
- Similarity-Preserving Knowledge Distillation. Frederick Tung, Greg Mori. 23 Jul 2019.
- Distill-2MD-MTL: Data Distillation based on Multi-Dataset Multi-Domain Multi-Task Frame Work to Solve Face Related Tasks. Sepidehsadat Hosseini, M. Shabani, N. Cho. 08 Jul 2019.
- GAN-Knowledge Distillation for one-stage Object Detection. Wanwei Wang, Jin ke Yu Fan Zong. 20 Jun 2019.
- Divide and Conquer: Leveraging Intermediate Feature Representations for Quantized Training of Neural Networks. Ahmed T. Elthakeb, Prannoy Pilligundla, Alex Cloninger, H. Esmaeilzadeh. 14 Jun 2019.
- Distilling Object Detectors with Fine-grained Feature Imitation. Tao Wang, Li-xin Yuan, Xiaopeng Zhang, Jiashi Feng. 09 Jun 2019.
- Efficient Object Embedding for Spliced Image Retrieval. Bor-Chun Chen, Zuxuan Wu, L. Davis, Ser-Nam Lim. 28 May 2019.
- Zero-shot Knowledge Transfer via Adversarial Belief Matching. P. Micaelli, Amos Storkey. 23 May 2019.
- Lightweight Network Architecture for Real-Time Action Recognition. Alexander Kozlov, Vadim Andronov, Y. Gritsenko. 21 May 2019.
- Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation. Linfeng Zhang, Jiebo Song, Anni Gao, Jingwei Chen, Chenglong Bao, Kaisheng Ma. 17 May 2019.
- Dynamic Neural Network Channel Execution for Efficient Training. Simeon E. Spasov, Pietro Lió. 15 May 2019.
- Learning What and Where to Transfer. Yunhun Jang, Hankook Lee, Sung Ju Hwang, Jinwoo Shin. 15 May 2019.
- Object Detection in 20 Years: A Survey. Zhengxia Zou, Keyan Chen, Zhenwei Shi, Yuhong Guo, Jieping Ye. 13 May 2019.
- High Frequency Residual Learning for Multi-Scale Image Classification. Bowen Cheng, Rong Xiao, Jianfeng Wang, Thomas Huang, Lei Zhang. 07 May 2019.
- Similarity of Neural Network Representations Revisited. Simon Kornblith, Mohammad Norouzi, Honglak Lee, Geoffrey E. Hinton. 01 May 2019.
- TextKD-GAN: Text Generation using Knowledge Distillation and Generative Adversarial Networks. Md. Akmal Haidar, Mehdi Rezagholizadeh. 23 Apr 2019.
- Student Becoming the Master: Knowledge Amalgamation for Joint Scene Parsing, Depth Estimation, and More. Jingwen Ye, Yixin Ji, Xinchao Wang, Kairi Ou, Dapeng Tao, Xiuming Zhang. 23 Apr 2019.
- Feature Fusion for Online Mutual Knowledge Distillation. Jangho Kim, Minsung Hyun, Inseop Chung, Nojun Kwak. 19 Apr 2019.
- End-to-End Speech Translation with Knowledge Distillation. Yuchen Liu, Hao Xiong, Zhongjun He, Jiajun Zhang, Hua Wu, Haifeng Wang, Chengqing Zong. 17 Apr 2019.
- Biphasic Learning of GANs for High-Resolution Image-to-Image Translation. Jie Cao, Huaibo Huang, Yi Li, Jingtuo Liu, Ran He, Zhenan Sun. 14 Apr 2019.
- Variational Information Distillation for Knowledge Transfer. Sungsoo Ahn, S. Hu, Andreas C. Damianou, Neil D. Lawrence, Zhenwen Dai. 11 Apr 2019.
- Spatiotemporal Knowledge Distillation for Efficient Estimation of Aerial Video Saliency. Jia Li, K. Fu, Shengwei Zhao, Shiming Ge. 10 Apr 2019.
- Knowledge Distillation For Recurrent Neural Network Language Modeling With Trust Regularization. Yangyang Shi, M. Hwang, X. Lei, Haoyu Sheng. 08 Apr 2019.
- Semantic-Aware Knowledge Preservation for Zero-Shot Sketch-Based Image Retrieval. Qing Liu, Lingxi Xie, Huiyu Wang, Alan Yuille. 05 Apr 2019.
- Branched Multi-Task Networks: Deciding What Layers To Share. Simon Vandenhende, Stamatios Georgoulis, Bert De Brabandere, Luc Van Gool. 05 Apr 2019.
- Correlation Congruence for Knowledge Distillation. Baoyun Peng, Xiao Jin, Jiaheng Liu, Shunfeng Zhou, Yichao Wu, Yu Liu, Dongsheng Li, Zhaoning Zhang. 03 Apr 2019.
- Training Quantized Neural Networks with a Full-precision Auxiliary Module. Bohan Zhuang, Lingqiao Liu, Mingkui Tan, Chunhua Shen, Ian Reid. 27 Mar 2019.
- Rectified Decision Trees: Towards Interpretability, Compression and Empirical Soundness. Jiawang Bai, Yiming Li, Jiawei Li, Yong Jiang, Shutao Xia. 14 Mar 2019.
- Structured Knowledge Distillation for Dense Prediction. Yifan Liu, Chris Liu, Jingdong Wang, Zhenbo Luo. 11 Mar 2019.
- A Learnable ScatterNet: Locally Invariant Convolutional Layers. Fergal Cotter, N. Kingsbury. 07 Mar 2019.
- Copying Machine Learning Classifiers. Irene Unceta, Jordi Nin, O. Pujol. 05 Mar 2019.
- Efficient Video Classification Using Fewer Frames. S. Bhardwaj, Mukundhan Srinivasan, Mitesh M. Khapra. 27 Feb 2019.
- Multilingual Neural Machine Translation with Knowledge Distillation. Xu Tan, Yi Ren, Di He, Tao Qin, Zhou Zhao, Tie-Yan Liu. 27 Feb 2019.
- XONN: XNOR-based Oblivious Deep Neural Network Inference. M. Riazi, Mohammad Samragh, Hao Chen, Kim Laine, Kristin E. Lauter, F. Koushanfar. 19 Feb 2019.
- Slimmable Neural Networks. Jiahui Yu, L. Yang, N. Xu, Jianchao Yang, Thomas Huang. 21 Dec 2018.
- Accelerating Large Scale Knowledge Distillation via Dynamic Importance Sampling. Minghan Li, Tanli Zuo, Ruicheng Li, Martha White, Weishi Zheng. 03 Dec 2018.
- Knowledge Distillation with Feature Maps for Image Classification. Wei-Chun Chen, Chia-Che Chang, Chien-Yu Lu, Che-Rung Lee. 03 Dec 2018.
- Data Augmentation using Random Image Cropping and Patching for Deep CNNs. Ryo Takahashi, Takashi Matsubara, K. Uehara. 22 Nov 2018.
- Factorized Distillation: Training Holistic Person Re-identification Model by Distilling an Ensemble of Partial ReID Models. Pengyuan Ren, Jianmin Li. 20 Nov 2018.