arXiv: 2405.19086
MEMoE: Enhancing Model Editing with Mixture of Experts Adaptors
29 May 2024
Renzhi Wang
Piji Li
KELM
ArXiv (abs)
PDF
HTML
HuggingFace (1 upvote)
Github (2752★)
Papers citing
"MEMoE: Enhancing Model Editing with Mixture of Experts Adaptors"
2 citing papers
One for All: Update Parameterized Knowledge Across Multiple Models
Annual Meeting of the Association for Computational Linguistics (ACL), 2025
Weitao Ma
Xiyuan Du
Xiaocheng Feng
Daigang Xu
Yichong Huang
...
Xiaoliang Yang
Baohang Li
Xiachong Feng
Ting Liu
Bing Qin
KELM
01 Jun 2025
LEMoE: Advanced Mixture of Experts Adaptor for Lifelong Model Editing of Large Language Models
Renzhi Wang
Piji Li
KELM
CLL
28 Jun 2024