ResearchTrend.AI

Stable Distillation: Regularizing Continued Pre-training for Low-Resource Automatic Speech Recognition (arXiv 2312.12783)
20 December 2023
Ashish Seth
Sreyan Ghosh
S. Umesh
Dinesh Manocha
    CLL

Papers citing "Stable Distillation: Regularizing Continued Pre-training for Low-Resource Automatic Speech Recognition"

2 papers shown
Magic dust for cross-lingual adaptation of monolingual wav2vec-2.0
Sameer Khurana, Antoine Laurent, James R. Glass
07 Oct 2021
CLSRIL-23: Cross Lingual Speech Representations for Indic Languages
Anirudh Gupta, Harveen Singh Chadha, Priyanshi Shah, Neeraj Chimmwal, Ankur Dhuriya, Rishabh Gaur, Vivek Raghavan
15 Jul 2021