ResearchTrend.AI

Reduce, Reuse, Recycle: Is Perturbed Data better than Other Language augmentation for Low Resource Self-Supervised Speech Models

22 September 2023
Asad Ullah, Alessandro Ragano, Andrew Hines
arXiv: 2309.12763

Papers citing "Reduce, Reuse, Recycle: Is Perturbed Data better than Other Language augmentation for Low Resource Self-Supervised Speech Models"

1 / 1 papers shown
A Noise-Robust Self-supervised Pre-training Model Based Speech Representation Learning for Automatic Speech Recognition
Qiu-shi Zhu, Jie M. Zhang, Zi-qiang Zhang, Ming Wu, Xin Fang, Lirong Dai
22 Jan 2022