ResearchTrend.AI

Transformer-based ASR Incorporating Time-reduction Layer and Fine-tuning with Self-Knowledge Distillation (arXiv:2103.09903)

17 March 2021
Md. Akmal Haidar, Chao Xing, Mehdi Rezagholizadeh

Papers citing "Transformer-based ASR Incorporating Time-reduction Layer and Fine-tuning with Self-Knowledge Distillation"

2 / 2 papers shown
FitHuBERT: Going Thinner and Deeper for Knowledge Distillation of Speech Self-Supervised Learning
Yeonghyeon Lee, Kangwook Jang, Jahyun Goo, Youngmoon Jung, Hoi-Rim Kim
01 Jul 2022
ATCSpeechNet: A multilingual end-to-end speech recognition framework for air traffic control systems
Yi Lin, Bo Yang, Linchao Li, Dongyue Guo, Jianwei Zhang, Hu Chen, Yi Zhang
17 Feb 2021