arXiv: 2311.01070
Multilingual DistilWhisper: Efficient Distillation of Multi-task Speech Models via Language-Specific Experts
Thomas Palmeira Ferraz, Marcely Zanon Boito, Caroline Brun, Vassilina Nikoulina
2 November 2023
Papers citing "Multilingual DistilWhisper: Efficient Distillation of Multi-task Speech Models via Language-Specific Experts" (7 / 7 papers shown)
Outlier Reduction with Gated Attention for Improved Post-training Quantization in Large Sequence-to-sequence Speech Foundation Models
Dominik Wagner, Ilja Baumann, K. Riedhammer, Tobias Bocklet
16 Jun 2024
Scaling Speech Technology to 1,000+ Languages
Vineel Pratap, Andros Tjandra, Bowen Shi, Paden Tomasello, Arun Babu, ..., Yossi Adi, Xiaohui Zhang, Wei-Ning Hsu, Alexis Conneau, Michael Auli
22 May 2023
Google USM: Scaling Automatic Speech Recognition Beyond 100 Languages
Yu Zhang, Wei Han, James Qin, Yongqiang Wang, Ankur Bapna, ..., Pedro J. Moreno, Chung-Cheng Chiu, J. Schalkwyk, Françoise Beaufays, Yonghui Wu
02 Mar 2023
Language-Universal Adapter Learning with Knowledge Distillation for End-to-End Multilingual Speech Recognition
Zhijie Shen, Wu Guo, Bin Gu
28 Feb 2023
FLEURS: Few-shot Learning Evaluation of Universal Representations of Speech
Alexis Conneau, Min Ma, Simran Khanuja, Yu Zhang, Vera Axelrod, Siddharth Dalmia, Jason Riesa, Clara E. Rivera, Ankur Bapna
25 May 2022
Residual Adapters for Parameter-Efficient ASR Adaptation to Atypical and Accented Speech
Katrin Tomanek, Vicky Zayats, Dirk Padfield, K. Vaillancourt, Fadi Biadsy
14 Sep 2021
Larger-Scale Transformers for Multilingual Masked Language Modeling
Naman Goyal, Jingfei Du, Myle Ott, Giridhar Anantharaman, Alexis Conneau
02 May 2021