ResearchTrend.AI

Leveraging Submodule Linearity Enhances Task Arithmetic Performance in LLMs
arXiv: 2504.10902 · 15 April 2025
Rui Dai, Sile Hu, Xu Shen, Yonggang Zhang, Xinmei Tian, Jieping Ye
Community: MoMe

Papers citing "Leveraging Submodule Linearity Enhances Task Arithmetic Performance in LLMs"

2 papers shown
Layer-Aware Task Arithmetic: Disentangling Task-Specific and Instruction-Following Knowledge
Yan-Lun Chen, Yi-Ru Wei, Chia-Yi Hsu, Chia-Mu Yu, Chun-ying Huang, Ying-Dar Lin, Yu-Sung Wu, Wei-Bin Lee
Communities: MoMe, KELM · 27 Feb 2025
Scalable Model Merging with Progressive Layer-wise Distillation
Jing Xu, Jiazheng Li, J. Zhang
Communities: MoMe, FedML · 18 Feb 2025