DA-MoE: Towards Dynamic Expert Allocation for Mixture-of-Experts Models

10 September 2024
Maryam Akhavan Aghdam, Hongpeng Jin, Yanzhao Wu
MoE
ArXiv (abs) · PDF · HTML

Papers citing "DA-MoE: Towards Dynamic Expert Allocation for Mixture-of-Experts Models"

3 / 3 papers shown

AnyExperts: On-Demand Expert Allocation for Multimodal Language Models with Mixture of Expert
Yuting Gao, Wang Lan, Hengyuan Zhao, Linjiang Huang, Si Liu, Q. Guo
MoE · 23 Nov 2025

Dynamic Reasoning Chains through Depth-Specialized Mixture-of-Experts in Transformer Architectures
Sampurna Roy, Ayan Sar, Anurag Kaushish, Kanav Gupta, Tanupriya Choudhury, Abhijit Kumar
MoE, LRM · 24 Sep 2025

MoQa: Rethinking MoE Quantization with Multi-stage Data-model Distribution Awareness
Zihao Zheng, Xiuping Cui, Size Zheng, Maoliang Li, Jiayu Chen, Yun Liang, Xiang Chen
MQ, MoE · 27 Mar 2025