Improved training for online end-to-end speech recognition systems
arXiv:1711.02212 · 6 November 2017
Suyoun Kim, M. Seltzer, Jinyu Li, Rui Zhao

Papers citing "Improved training for online end-to-end speech recognition systems"

6 / 6 papers shown
  1. Knowledge Distillation from Non-streaming to Streaming ASR Encoder using Auxiliary Non-streaming Layer
     Kyuhong Shim, Jinkyu Lee, Simyoung Chang, Kyuwoong Hwang — 31 Aug 2023
  2. Mutual Learning of Single- and Multi-Channel End-to-End Neural Diarization
     Shota Horiguchi, Yuki Takashima, Shinji Watanabe, Leibny Paola García-Perera — 07 Oct 2022
  3. Recent Advances in End-to-End Automatic Speech Recognition
     Jinyu Li — 02 Nov 2021
  4. BigSSL: Exploring the Frontier of Large-Scale Semi-Supervised Learning for Automatic Speech Recognition
     Yu Zhang, Daniel S. Park, Wei Han, James Qin, Anmol Gulati, ..., Zhifeng Chen, Quoc V. Le, Chung-Cheng Chiu, Ruoming Pang, Yonghui Wu — 27 Sep 2021
  5. Collaborative Training of Acoustic Encoders for Speech Recognition
     Varun K. Nagaraja, Yangyang Shi, Ganesh Venkatesh, Ozlem Kalinli, M. Seltzer, Vikas Chandra — 16 Jun 2021
  6. Guiding CTC Posterior Spike Timings for Improved Posterior Fusion and Knowledge Distillation
     Gakuto Kurata, Kartik Audhkhasi — 17 Apr 2019