arXiv:2505.22323
Advancing Expert Specialization for Better MoE
28 May 2025
Hongcan Guo, Haolang Lu, Guoshun Nan, Bolun Chu, Jialin Zhuang, Yuan Yang, Wenhao Che, Sicong Leng, Qimei Cui, Xudong Jiang
Tags: MoE, MoMe
Papers citing "Advancing Expert Specialization for Better MoE" (2 of 2 papers shown)
Dynamic Mixture of Experts: An Auto-Tuning Approach for Efficient Transformer Models
Yongxin Guo, Zhenglin Cheng, Xiaoying Tang, Tao Lin
MoE
23 May 2024
Shortcut-connected Expert Parallelism for Accelerating Mixture-of-Experts
Weilin Cai, Juyong Jiang, Le Qin, Junwei Cui, Sunghun Kim, Jiayi Huang
07 Apr 2024