Diversifying the Expert Knowledge for Task-Agnostic Pruning in Sparse Mixture-of-Experts

12 July 2024
Zeliang Zhang
Xiaodong Liu
Hao Cheng
Chenliang Xu
Jianfeng Gao
    MoE
ArXiv (abs)PDFHTML

Papers citing "Diversifying the Expert Knowledge for Task-Agnostic Pruning in Sparse Mixture-of-Experts"

13 papers shown

Video-R4: Reinforcing Text-Rich Video Reasoning with Visual Rumination
Y. Tang, Daiki Shimada, Hang Hua, Chao Huang, Jing Bi, Rogerio Feris, Chenliang Xu
21 Nov 2025

DiEP: Adaptive Mixture-of-Experts Compression through Differentiable Expert Pruning
Sikai Bai, Haoxi Li, Jie Zhang, Zicong Hong, Song Guo
MoE · 19 Sep 2025

MoBE: Mixture-of-Basis-Experts for Compressing MoE-based LLMs
Xiaodong Chen, Mingming Ha, Zhenzhong Lan, Jing Zhang, Jianguo Li
MoE · 07 Aug 2025

CAMERA: Multi-Matrix Joint Compression for MoE Models via Micro-Expert Redundancy Analysis
Yuzhuang Xu, Xu Han, Yuanchi Zhang, Yixuan Wang, Yijun Liu, Shiyu Ji, Qingfu Zhu, Wanxiang Che
MoE, MQ · 04 Aug 2025

Unveiling Super Experts in Mixture-of-Experts Large Language Models
Zunhai Su, Qingyuan Li, Hao Zhang, Weihao Ye, Qibo Xue, YuLei Qian, Yuchen Xie, Ngai Wong, Kehong Yuan
MoE · 31 Jul 2025

MoSE: Skill-by-Skill Mixture-of-Experts Learning for Embodied Autonomous Machines
Lu Xu, Jiaqian Yu, Xiongfeng Peng, Yiwei Chen, W. Li, J. Yoo, Sunghyun Chunag, Dongwook Lee, Daehyun Ji, Chao Zhang
MoE · 10 Jul 2025

Unveiling Hidden Collaboration within Mixture-of-Experts in Large Language Models
Yuanbo Tang, Yan Tang, N. Zhang, Meixuan Chen, Yang Li
MoE · 16 Apr 2025

Cluster-Driven Expert Pruning for Mixture-of-Experts Large Language Models
Hongcheng Guo, Juntao Yao, Boyang Wang, Junjia Du, Shaosheng Cao, Donglin Di, Shun Zhang, Hui Yuan
MoE · 10 Apr 2025

Model Assembly Learning with Heterogeneous Layer Weight Merging
Yi-Kai Zhang, Jin Wang, Xu-Xiang Zhong, De-Chuan Zhan, Han-Jia Ye
MoMe · 27 Mar 2025

CalibQuant: 1-Bit KV Cache Quantization for Multimodal LLMs
Zeliang Zhang, Yifan Zhu, Susan Liang, Zhiyuan Wang, Jiani Liu, ..., Mingjie Zhao, Chenliang Xu, Kun Wan, Wentian Zhao
VLM, MQ · 15 Feb 2025

MoE-Pruner: Pruning Mixture-of-Experts Large Language Model using the Hints from Its Router
Yanyue Xie, Zhi Zhang, Ding Zhou, Cong Xie, Ziang Song, Xin Liu, Yanzhi Wang, Xue Lin, An Xu
LLMAG · 15 Oct 2024

STUN: Structured-Then-Unstructured Pruning for Scalable MoE Pruning (Annual Meeting of the Association for Computational Linguistics (ACL), 2024)
Jaeseong Lee, Seung-won Hwang, Aurick Qiao, Daniel F Campos, Z. Yao, Yuxiong He
10 Sep 2024

A Closer Look into Mixture-of-Experts in Large Language Models
Ka Man Lo, Zeyu Huang, Zihan Qiu, Zili Wang, Jie Fu
MoE · 26 Jun 2024