arXiv: 1904.04163
Knowledge Distillation For Recurrent Neural Network Language Modeling With Trust Regularization
8 April 2019
Yangyang Shi, M. Hwang, X. Lei, Haoyu Sheng
Papers citing "Knowledge Distillation For Recurrent Neural Network Language Modeling With Trust Regularization" (4 of 4 papers shown)
Preventing Catastrophic Forgetting and Distribution Mismatch in Knowledge Distillation via Synthetic Data. Kuluhan Binici, N. Pham, T. Mitra, K. Leman. 11 Aug 2021.

Spectral Pruning for Recurrent Neural Networks. Takashi Furuya, Kazuma Suetake, K. Taniguchi, Hiroyuki Kusumoto, Ryuji Saiin, Tomohiro Daimon. 23 May 2021.

Knowledge Distillation: A Survey. Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao. 09 Jun 2020.

Neural Architecture Search with Reinforcement Learning. Barret Zoph, Quoc V. Le. 05 Nov 2016.