2503.00634
Efficiently Editing Mixture-of-Experts Models with Compressed Experts
1 March 2025
Y. He, Yang Liu, Chen Liang, Hany Awadalla
MoE
Papers citing "Efficiently Editing Mixture-of-Experts Models with Compressed Experts"
BadMoE: Backdooring Mixture-of-Experts LLMs via Optimizing Routing Triggers and Infecting Dormant Experts
Qingyue Wang, Qi Pang, Xixun Lin, Shuai Wang, Daoyuan Wu
MoE
24 Apr 2025