ResearchTrend.AI

CTC-synchronous Training for Monotonic Attention Model

Hirofumi Inaguma, Masato Mimura, Tatsuya Kawahara
10 May 2020 · arXiv:2005.04712

Papers citing "CTC-synchronous Training for Monotonic Attention Model"

4 papers shown
VAD-free Streaming Hybrid CTC/Attention ASR for Unsegmented Recording
Hirofumi Inaguma, Tatsuya Kawahara
Interspeech, 2021 · 15 Jul 2021
StableEmit: Selection Probability Discount for Reducing Emission Latency of Streaming Monotonic Attention ASR
Hirofumi Inaguma, Tatsuya Kawahara
01 Jul 2021
Alignment Knowledge Distillation for Online Streaming Attention-based Speech Recognition
Hirofumi Inaguma, Tatsuya Kawahara
IEEE/ACM Transactions on Audio, Speech, and Language Processing (TASLP), 2021 · 28 Feb 2021
Enhancing Monotonic Multihead Attention for Streaming ASR
Hirofumi Inaguma, Masato Mimura, Tatsuya Kawahara
19 May 2020