Cited By

Knowledge distillation for optimization of quantized deep neural networks
Sungho Shin, Yoonho Boo, Wonyong Sung
arXiv:1909.01688, 4 September 2019
Tags: MQ

Papers citing "Knowledge distillation for optimization of quantized deep neural networks" (4 of 4 papers shown)

ERNIE-Tiny: A Progressive Distillation Framework for Pretrained Transformer Compression
Weiyue Su, Xuyi Chen, Shi Feng, Jiaxiang Liu, Weixin Liu, Yu Sun, Hao Tian, Hua Wu, Haifeng Wang
04 Jun 2021

Stochastic Precision Ensemble: Self-Knowledge Distillation for Quantized Deep Neural Networks
Yoonho Boo, Sungho Shin, Jungwook Choi, Wonyong Sung
Tags: MQ
30 Sep 2020

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
Tags: VLM
09 Jun 2020

Neural Compatibility Modeling with Attentive Knowledge Distillation
Xuemeng Song, Fuli Feng, Xianjing Han, Xin Yang, Wei Liu, Liqiang Nie
17 Apr 2018