FitNets: Hints for Thin Deep Nets (arXiv:1412.6550)
19 December 2014
Adriana Romero, Nicolas Ballas, Samira Ebrahimi Kahou, Antoine Chassang, C. Gatta, Yoshua Bengio
Papers citing "FitNets: Hints for Thin Deep Nets" (showing 50 of 676)

Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution (07 Sep 2021). Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu.
Dual Transfer Learning for Event-based End-task Prediction via Pluggable Event to Image Translation (04 Sep 2021). Lin Wang, Yujeong Chae, Kuk-Jin Yoon.
Adversarial Robustness for Unsupervised Domain Adaptation (02 Sep 2021). Muhammad Awais, Fengwei Zhou, Hang Xu, Lanqing Hong, Ping Luo, Sung-Ho Bae, Zhenguo Li.
Full-Cycle Energy Consumption Benchmark for Low-Carbon Computer Vision (30 Aug 2021). Bo-wen Li, Xinyang Jiang, Donglin Bai, Yuge Zhang, Ningxin Zheng, Xuanyi Dong, Lu Liu, Yuqing Yang, Dongsheng Li.
CoCo DistillNet: a Cross-layer Correlation Distillation Network for Pathological Gastric Cancer Segmentation (27 Aug 2021). Wenxuan Zou, Muyi Sun.
Efficient Medical Image Segmentation Based on Knowledge Distillation (23 Aug 2021). Dian Qin, Jiajun Bu, Zhe Liu, Xin Shen, Sheng Zhou, Jingjun Gu, Zhihong Wang, Lei Wu, Hui-Fen Dai.
LIGA-Stereo: Learning LiDAR Geometry Aware Representations for Stereo-based 3D Detector (18 Aug 2021). Xiaoyang Guo, Shaoshuai Shi, Xiaogang Wang, Hongsheng Li. [3DPC]
Joint Multiple Intent Detection and Slot Filling via Self-distillation (18 Aug 2021). Lisong Chen, Peilin Zhou, Yuexian Zou. [VLM]
G-DetKD: Towards General Distillation Framework for Object Detectors via Contrastive and Semantic-guided Feature Imitation (17 Aug 2021). Lewei Yao, Renjie Pi, Hang Xu, Wei Zhang, Zhenguo Li, Tong Zhang.
Enhancing Self-supervised Video Representation Learning via Multi-level Feature Optimization (04 Aug 2021). Rui Qian, Yuxi Li, Huabin Liu, John See, Shuangrui Ding, Xian Liu, Dian Li, Weiyao Lin.
Online Knowledge Distillation for Efficient Pose Estimation (04 Aug 2021). Zheng Li, Jingwen Ye, Xiuming Zhang, Ying Huang, Zhigeng Pan.
Hierarchical Self-supervised Augmented Knowledge Distillation (29 Jul 2021). Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu. [SSL]
MFAGAN: A Compression Framework for Memory-Efficient On-Device Super-Resolution GAN (27 Jul 2021). Wenlong Cheng, Mingbo Zhao, Zhiling Ye, Shuhang Gu.
ReSSL: Relational Self-Supervised Learning with Weak Augmentation (20 Jul 2021). Mingkai Zheng, Shan You, Fei Wang, Chao Qian, Changshui Zhang, Xiaogang Wang, Chang Xu.
Double Similarity Distillation for Semantic Image Segmentation (19 Jul 2021). Yingchao Feng, Xian Sun, Wenhui Diao, Jihao Li, Xin Gao.
Unpaired cross-modality educed distillation (CMEDL) for medical image segmentation (16 Jul 2021). Jue Jiang, A. Rimner, Joseph O. Deasy, Harini Veeraraghavan.
Trustworthy AI: A Computational Perspective (12 Jul 2021). Haochen Liu, Yiqi Wang, Wenqi Fan, Xiaorui Liu, Yaxin Li, Shaili Jain, Yunhao Liu, Anil K. Jain, Jiliang Tang. [FaML]
Noise Stability Regularization for Improving BERT Fine-tuning (10 Jul 2021). Hang Hua, Xingjian Li, Dejing Dou, Chengzhong Xu, Jiebo Luo.
Novel Visual Category Discovery with Dual Ranking Statistics and Mutual Knowledge Distillation (07 Jul 2021). Bingchen Zhao, Kai Han.
Deep Learning for Micro-expression Recognition: A Survey (06 Jul 2021). Yante Li, Jinsheng Wei, Yang Liu, Janne Kauttonen, Guoying Zhao.
A Light-weight Deep Human Activity Recognition Algorithm Using Multi-knowledge Distillation (06 Jul 2021). Runze Chen, Haiyong Luo, Fang Zhao, Xuechun Meng, Zhiqing Xie, Yida Zhu. [VLM, HAI]
On The Distribution of Penultimate Activations of Classification Networks (05 Jul 2021). Minkyo Seo, Yoonho Lee, Suha Kwak. [UQCV]
Audio-Oriented Multimodal Machine Comprehension: Task, Dataset and Model (04 Jul 2021). Zhiqi Huang, Fenglin Liu, Xian Wu, Shen Ge, Helin Wang, Wei Fan, Yuexian Zou. [AuLLM]
Learning Efficient Vision Transformers via Fine-Grained Manifold Distillation (03 Jul 2021). Zhiwei Hao, Jianyuan Guo, Ding Jia, Kai Han, Yehui Tang, Chao Zhang, Dacheng Tao, Yunhe Wang. [ViT]
Simple Distillation Baselines for Improving Small Self-supervised Models (21 Jun 2021). Jindong Gu, Wei Liu, Yonglong Tian.
Knowledge Distillation via Instance-level Sequence Learning (21 Jun 2021). Haoran Zhao, Xin Sun, Junyu Dong, Zihe Dong, Qiong Li.
The Limitations of Large Width in Neural Networks: A Deep Gaussian Process Perspective (11 Jun 2021). Geoff Pleiss, John P. Cunningham.
Knowledge distillation: A good teacher is patient and consistent (09 Jun 2021). Lucas Beyer, Xiaohua Zhai, Amelie Royer, L. Markeeva, Rohan Anil, Alexander Kolesnikov. [VLM]
Privileged Graph Distillation for Cold Start Recommendation (31 May 2021). Shuai Wang, Kun Zhang, Le Wu, Haiping Ma, Richang Hong, Meng Wang.
Fair Feature Distillation for Visual Recognition (27 May 2021). S. Jung, Donggyu Lee, Taeeon Park, Taesup Moon.
Divide and Contrast: Self-supervised Learning from Uncurated Data (17 May 2021). Yonglong Tian, Olivier J. Hénaff, Aaron van den Oord. [SSL]
Graph-Free Knowledge Distillation for Graph Neural Networks (16 May 2021). Xiang Deng, Zhongfei Zhang.
Carrying out CNN Channel Pruning in a White Box (24 Apr 2021). Yuxin Zhang, Mingbao Lin, Chia-Wen Lin, Jie Chen, Feiyue Huang, Yongjian Wu, Yonghong Tian, Rongrong Ji. [VLM]
Balanced Knowledge Distillation for Long-tailed Learning (21 Apr 2021). Shaoyu Zhang, Chen Chen, Xiyuan Hu, Silong Peng.
Distill on the Go: Online knowledge distillation in self-supervised learning (20 Apr 2021). Prashant Shivaram Bhat, Elahe Arani, Bahram Zonooz. [SSL]
DisCo: Remedy Self-supervised Learning on Lightweight Models with Distilled Contrastive Learning (19 Apr 2021). Yuting Gao, Jia-Xin Zhuang, Xiaowei Guo, Hao Cheng, Xing Sun, Ke Li, Feiyue Huang.
Distilling Knowledge via Knowledge Review (19 Apr 2021). Pengguang Chen, Shu Liu, Hengshuang Zhao, Jiaya Jia.
End-to-End Interactive Prediction and Planning with Optical Flow Distillation for Autonomous Driving (18 Apr 2021). Hengli Wang, Peide Cai, Rui Fan, Yuxiang Sun, Ming Liu.
Lottery Jackpots Exist in Pre-trained Models (18 Apr 2021). Yuxin Zhang, Mingbao Lin, Yan Wang, Rongrong Ji.
MRI-based Alzheimer's disease prediction via distilling the knowledge in multi-modal data (08 Apr 2021). Hao Guan, Chaoyue Wang, Dacheng Tao.
Distilling and Transferring Knowledge via cGAN-generated Samples for Image Classification and Regression (07 Apr 2021). Xin Ding, Z. J. Wang, Zuheng Xu, Z. Jane Wang, William J. Welch.
SIMPLE: SIngle-network with Mimicking and Point Learning for Bottom-up Human Pose Estimation (06 Apr 2021). Jiabin Zhang, Zheng Zhu, Jiwen Lu, Junjie Huang, Guan Huang, Jie Zhou. [3DH]
Learning from Self-Discrepancy via Multiple Co-teaching for Cross-Domain Person Re-Identification (06 Apr 2021). Suncheng Xiang, Yuzhuo Fu, Mengyuan Guan, Ting Liu.
Content-Aware GAN Compression (06 Apr 2021). Yuchen Liu, Zhixin Shu, Yijun Li, Zhe-nan Lin, Federico Perazzi, S. Kung. [GAN]
Going deeper with Image Transformers (31 Mar 2021). Hugo Touvron, Matthieu Cord, Alexandre Sablayrolles, Gabriel Synnaeve, Hervé Jégou. [ViT]
Complementary Relation Contrastive Distillation (29 Mar 2021). Jinguo Zhu, Shixiang Tang, Dapeng Chen, Shijie Yu, Yakun Liu, A. Yang, M. Rong, Xiaohua Wang.
Distilling Object Detectors via Decoupled Features (26 Mar 2021). Jianyuan Guo, Kai Han, Yunhe Wang, Han Wu, Xinghao Chen, Chunjing Xu, Chang Xu.
Distilling a Powerful Student Model via Online Knowledge Distillation (26 Mar 2021). Shaojie Li, Mingbao Lin, Yan Wang, Yongjian Wu, Yonghong Tian, Ling Shao, Rongrong Ji. [FedML]
Universal Representation Learning from Multiple Domains for Few-shot Classification (25 Mar 2021). Weihong Li, Xialei Liu, Hakan Bilen. [SSL, OOD, VLM]
Learning Scene Structure Guidance via Cross-Task Knowledge Transfer for Single Depth Super-Resolution (24 Mar 2021). Baoli Sun, Xinchen Ye, Baopu Li, Haojie Li, Zhihui Wang, Rui Xu.