MEMoE: Enhancing Model Editing with Mixture of Experts Adaptors

29 May 2024
Renzhi Wang, Piji Li
KELM

Papers citing "MEMoE: Enhancing Model Editing with Mixture of Experts Adaptors" (5 of 5 papers shown):
• Consecutive Model Editing with Batch alongside HooK Layers
  Shuaiyi Li, Yang Deng, Deng Cai, Hongyuan Lu, Liang Chen, Wai Lam
  KELM · 08 Mar 2024

• Calibrating Factual Knowledge in Pretrained Language Models
  Qingxiu Dong, Damai Dai, Yifan Song, Jingjing Xu, Zhifang Sui, Lei Li
  KELM · 07 Oct 2022

• Mixture-of-Experts with Expert Choice Routing
  Yanqi Zhou, Tao Lei, Hanxiao Liu, Nan Du, Yanping Huang, Vincent Zhao, Andrew M. Dai, Zhifeng Chen, Quoc V. Le, James Laudon
  MoE · 18 Feb 2022

• Fast Model Editing at Scale
  Eric Mitchell, Charles Lin, Antoine Bosselut, Chelsea Finn, Christopher D. Manning
  KELM · 21 Oct 2021

• Language Models as Knowledge Bases?
  Fabio Petroni, Tim Rocktäschel, Patrick Lewis, Anton Bakhtin, Yuxiang Wu, Alexander H. Miller, Sebastian Riedel
  KELM · AI4MH · 03 Sep 2019