
MEMoE: Enhancing Model Editing with Mixture of Experts Adaptors
arXiv:2405.19086 · 29 May 2024
Renzhi Wang
Piji Li
    KELM
arXiv (abs) · PDF · HTML · HuggingFace (1 upvote) · GitHub (2,752★)

Papers citing "MEMoE: Enhancing Model Editing with Mixture of Experts Adaptors"

2 citing papers
One for All: Update Parameterized Knowledge Across Multiple Models
Annual Meeting of the Association for Computational Linguistics (ACL), 2025
Weitao Ma
Xiyuan Du
Xiaocheng Feng
Daigang Xu
Yichong Huang
...
Xiaoliang Yang
Baohang Li
Xiachong Feng
Ting Liu
Bing Qin
KELM
196 · 0 · 0
01 Jun 2025
LEMoE: Advanced Mixture of Experts Adaptor for Lifelong Model Editing of Large Language Models
Renzhi Wang
Piji Li
KELM · CLL
316 · 16 · 0
28 Jun 2024