SpeechMoE2: Mixture-of-Experts Model with Improved Routing
arXiv: 2111.11831
23 November 2021
Zhao You, Shulin Feng, Dan Su, Dong Yu
Papers citing "SpeechMoE2: Mixture-of-Experts Model with Improved Routing" (5 / 5 papers shown)
LUPET: Incorporating Hierarchical Information Path into Multilingual ASR
Wei Liu, Jingyong Hou, Dong Yang, Muyong Cao, Tan Lee
10 Jan 2025
MoE-RBench: Towards Building Reliable Language Models with Sparse Mixture-of-Experts
Guanjie Chen, Xinyu Zhao, Tianlong Chen, Yu Cheng
17 Jun 2024
A General Theory for Softmax Gating Multinomial Logistic Mixture of Experts
Huy Nguyen, Pedram Akbarian, TrungTin Nguyen, Nhat Ho
22 Oct 2023
Learning When to Trust Which Teacher for Weakly Supervised ASR
Aakriti Agrawal, Milind Rao, Anit Kumar Sahu, Gopinath Chennupati, A. Stolcke
21 Jun 2023
3M: Multi-loss, Multi-path and Multi-level Neural Networks for speech recognition
Zhao You, Shulin Feng, Dan Su, Dong Yu
07 Apr 2022