Knowledge Distillation For Recurrent Neural Network Language Modeling With Trust Regularization

arXiv 1904.04163, 8 April 2019
Yangyang Shi, M. Hwang, X. Lei, Haoyu Sheng

Papers citing "Knowledge Distillation For Recurrent Neural Network Language Modeling With Trust Regularization"

4 of 4 citing papers shown.

  1. Preventing Catastrophic Forgetting and Distribution Mismatch in Knowledge Distillation via Synthetic Data
     Kuluhan Binici, N. Pham, T. Mitra, K. Leman
     40 citations, 11 Aug 2021

  2. Spectral Pruning for Recurrent Neural Networks
     Takashi Furuya, Kazuma Suetake, K. Taniguchi, Hiroyuki Kusumoto, Ryuji Saiin, Tomohiro Daimon
     4 citations, 23 May 2021

  3. Knowledge Distillation: A Survey (VLM)
     Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
     2,832 citations, 09 Jun 2020

  4. Neural Architecture Search with Reinforcement Learning
     Barret Zoph, Quoc V. Le
     5,326 citations, 05 Nov 2016