ResearchTrend.AI

arXiv: 1910.12418
Unsupervised pre-training for sequence to sequence speech recognition

28 October 2019
Zhiyun Fan, Shiyu Zhou, Bo Xu
SSL, AI4TS

Papers citing "Unsupervised pre-training for sequence to sequence speech recognition"

9 papers shown
Sequence-to-sequence models in peer-to-peer learning: A practical application
Robert Šajina, Ivo Ipšić
02 May 2024

A Complementary Joint Training Approach Using Unpaired Speech and Text for Low-Resource Automatic Speech Recognition
Ye Du, Jie Zhang, Qiu-shi Zhu, Lirong Dai, Ming Wu, Xin Fang, Zhouwang Yang
05 Apr 2022

Pre-Training Transformer Decoder for End-to-End ASR Model with Unpaired Speech Data
Junyi Ao, Zi-Hua Zhang, Long Zhou, Shujie Liu, Haizhou Li, Tom Ko, Lirong Dai, Jinyu Li, Yao Qian, Furu Wei
SSL
31 Mar 2022

Pretrained Language Models for Text Generation: A Survey
Junyi Li, Tianyi Tang, Wayne Xin Zhao, J. Nie, Ji-Rong Wen
AI4CE
14 Jan 2022

Dropout Regularization for Self-Supervised Learning of Transformer Encoder Speech Representation
Jian Luo, Jianzong Wang, Ning Cheng, Jing Xiao
SSL
09 Jul 2021

Pretrained Language Models for Text Generation: A Survey
Junyi Li, Tianyi Tang, Wayne Xin Zhao, Ji-Rong Wen
LM&MA, VLM, SyDa
21 May 2021

Non-autoregressive Transformer-based End-to-end ASR using BERT
Fu-Hao Yu, Kuan-Yu Chen
10 Apr 2021

Listen Attentively, and Spell Once: Whole Sentence Generation via a Non-Autoregressive Architecture for Low-Latency Speech Recognition
Ye Bai, Jiangyan Yi, J. Tao, Zhengkun Tian, Zhengqi Wen, Shuai Zhang
RALM
11 May 2020

Listen and Fill in the Missing Letters: Non-Autoregressive Transformer for Speech Recognition
Nanxin Chen, Shinji Watanabe, Jesús Villalba, Najim Dehak
10 Nov 2019