MoE++: Accelerating Mixture-of-Experts Methods with Zero-Computation Experts

9 October 2024
Peng Jin, Bo Zhu, Li Yuan, Shuicheng Yan
Topic: MoE
arXiv: 2410.07348 · PDF · HTML
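
This page carries no abstract, so as orientation only: the title's "zero-computation experts" suggests routing some tokens to experts that perform essentially no FLOPs. Below is a minimal, hypothetical PyTorch sketch of how such experts (zero, copy, and constant variants) might sit alongside ordinary FFN experts under top-1 routing; the class name, expert types, and routing scheme are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MoEWithZeroComputationExperts(nn.Module):
    """Toy top-1-routed MoE layer mixing standard FFN experts with
    zero-computation experts (zero, copy, constant). Illustrative only;
    names and structure are assumptions, not the MoE++ code."""

    def __init__(self, d_model: int = 64, n_ffn_experts: int = 4, d_ff: int = 256):
        super().__init__()
        self.ffn_experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_ffn_experts)
        ])
        # Three zero-computation experts: they add (almost) no FLOPs,
        # so tokens routed to them are effectively free to process.
        self.const = nn.Parameter(torch.zeros(d_model))  # constant expert's output
        self.n_experts = n_ffn_experts + 3
        self.router = nn.Linear(d_model, self.n_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (n_tokens, d_model); top-1 routing keeps the sketch simple.
        probs = F.softmax(self.router(x), dim=-1)
        top_p, top_idx = probs.max(dim=-1)
        out = torch.zeros_like(x)
        n_ffn = len(self.ffn_experts)
        for e in range(self.n_experts):
            mask = top_idx == e
            if not mask.any():
                continue
            if e < n_ffn:            # standard FFN expert: full computation
                y = self.ffn_experts[e](x[mask])
            elif e == n_ffn:         # zero expert: discard the token's update
                y = torch.zeros_like(x[mask])
            elif e == n_ffn + 1:     # copy expert: identity, pass token through
                y = x[mask]
            else:                    # constant expert: learned constant vector
                y = self.const.expand_as(x[mask])
            out[mask] = top_p[mask].unsqueeze(-1) * y
        return out


# Quick smoke test on random tokens.
layer = MoEWithZeroComputationExperts()
print(layer(torch.randn(8, 64)).shape)  # torch.Size([8, 64])
```

The intended payoff of such a design is that the router can send "easy" tokens to the free experts, reserving real FFN computation for tokens that need it.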

Papers citing "MoE++: Accelerating Mixture-of-Experts Methods with Zero-Computation Experts"

DSMoE: Matrix-Partitioned Experts with Dynamic Routing for Computation-Efficient Dense LLMs
Minxuan Lv, Zhenpeng Su, Leiyu Pan, Yizhe Xiong, Zijia Lin, ..., Guiguang Ding, Cheng Luo, Di Zhang, Kun Gai, Songlin Hu
Topic: MoE · 18 Feb 2025