Selective Aggregation for Low-Rank Adaptation in Federated Learning

2 October 2024
Pengxin Guo
Shuang Zeng
Yanran Wang
Huijie Fan
Feifei Wang
Liangqiong Qu
Abstract

We investigate LoRA in federated learning through the lens of an asymmetry analysis of the learned A and B matrices. In doing so, we uncover that A matrices are responsible for learning general knowledge, while B matrices focus on capturing client-specific knowledge. Based on this finding, we introduce Federated Share-A Low-Rank Adaptation (FedSA-LoRA), which employs two low-rank trainable matrices A and B to model the weight update, but only the A matrices are shared with the server for aggregation. Moreover, we delve into the relationship between the learned A and B matrices in other LoRA variants, such as rsLoRA and VeRA, revealing a consistent pattern. Consequently, we extend our FedSA-LoRA method to these LoRA variants, resulting in FedSA-rsLoRA and FedSA-VeRA. In this way, we establish a general paradigm for integrating LoRA with FL, offering guidance for future work on subsequent LoRA variants combined with FL. Extensive experimental results on natural language understanding and generation tasks demonstrate the effectiveness of the proposed method. Our code is available at this https URL.
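To make the selective-aggregation step concrete, the short Python sketch below averages only the A matrices across clients while every B matrix stays on its client. This is a minimal illustration reconstructed from the abstract alone, assuming PyTorch tensors, a FedAvg-style weighted mean, and illustrative names and shapes (aggregate_shared_A, rank 8, width 512); it is not the authors' released implementation.

import torch

# Schematic FedSA-LoRA round (illustrative only). LoRA models the weight
# update as Delta W = B @ A, with B (d_out x r) initialized to zero and
# A (r x d_in) initialized randomly, following standard LoRA practice.

def aggregate_shared_A(client_As, client_weights=None):
    """FedAvg-style weighted average applied only to the LoRA A matrices.
    The B matrices never leave the clients."""
    if client_weights is None:  # default to a uniform average
        client_weights = [1.0] * len(client_As)
    total = sum(client_weights)
    return sum((w / total) * A for w, A in zip(client_weights, client_As))

# Four hypothetical clients, each holding local LoRA factors A_i and B_i.
clients = [{"A": torch.randn(8, 512), "B": torch.zeros(512, 8)}
           for _ in range(4)]

# One communication round: clients upload A_i only; the server broadcasts
# the averaged A; each client keeps its own B_i (client-specific knowledge).
A_global = aggregate_shared_A([c["A"] for c in clients])
for c in clients:
    c["A"] = A_global.clone()  # shared general knowledge
    # c["B"] is left untouched: it is never aggregated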

@article{guo2025_2410.01463,
  title={Selective Aggregation for Low-Rank Adaptation in Federated Learning},
  author={Pengxin Guo and Shuang Zeng and Yanran Wang and Huijie Fan and Feifei Wang and Liangqiong Qu},
  journal={arXiv preprint arXiv:2410.01463},
  year={2025}
}