Cited By
DA-MoE: Towards Dynamic Expert Allocation for Mixture-of-Experts Models
arXiv:2409.06669 · 10 September 2024
Maryam Akhavan Aghdam, Hongpeng Jin, Yanzhao Wu
Topics: MoE
Papers citing "DA-MoE: Towards Dynamic Expert Allocation for Mixture-of-Experts Models" (3 of 3 papers shown)
1. AnyExperts: On-Demand Expert Allocation for Multimodal Language Models with Mixture of Expert
   Yuting Gao, Wang Lan, Hengyuan Zhao, Linjiang Huang, Si Liu, Q. Guo
   Topics: MoE · 23 Nov 2025
2. Dynamic Reasoning Chains through Depth-Specialized Mixture-of-Experts in Transformer Architectures
   Sampurna Roy, Ayan Sar, Anurag Kaushish, Kanav Gupta, Tanupriya Choudhury, Abhijit Kumar
   Topics: MoE, LRM · 24 Sep 2025
3. MoQa: Rethinking MoE Quantization with Multi-stage Data-model Distribution Awareness
   Zihao Zheng, Xiuping Cui, Size Zheng, Maoliang Li, Jiayu Chen, Yun Liang, Xiang Chen
   Topics: MQ, MoE · 27 Mar 2025