Mixture-of-Subspaces in Low-Rank Adaptation

16 June 2024
Taiqiang Wu
Jiahao Wang
Zhe Zhao
Ngai Wong
Abstract

In this paper, we introduce a subspace-inspired Low-Rank Adaptation (LoRA) method, which is computationally efficient, easy to implement, and readily applicable to large language, multimodal, and diffusion models. Initially, we equivalently decompose the weights of LoRA into two subspaces, and find that simply mixing them can enhance performance. To study this phenomenon, we revisit it through a fine-grained subspace lens, showing that such a modification is equivalent to employing a fixed mixer to fuse the subspaces. To be more flexible, we jointly learn the mixer with the original LoRA weights, and term the method Mixture-of-Subspaces LoRA (MoSLoRA). MoSLoRA consistently outperforms LoRA on tasks in different modalities, including commonsense reasoning, visual instruction tuning, and subject-driven text-to-image generation, demonstrating its effectiveness and robustness. Codes are available at this https URL.
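The core idea lends itself to a short sketch. Below is a minimal PyTorch illustration of the learnable mixer described in the abstract, assuming the standard LoRA parameterization W = W0 + BA; the class name, the identity initialization of the mixer, and the alpha/r scaling convention are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of the MoSLoRA idea, assuming standard LoRA conventions.
# Names (MoSLoRALinear, A, M, B) are illustrative, not from the paper's code.
import torch
import torch.nn as nn

class MoSLoRALinear(nn.Module):
    """Frozen linear layer with a low-rank update B @ M @ A, where the
    r x r mixer M fuses the rank-wise subspaces and is trained jointly
    with A and B. Plain LoRA corresponds to a fixed identity mixer."""

    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # freeze the pretrained weights

        d_in, d_out = base.in_features, base.out_features
        self.A = nn.Parameter(torch.randn(r, d_in) * 0.01)  # down-projection
        self.M = nn.Parameter(torch.eye(r))                 # learnable mixer
        self.B = nn.Parameter(torch.zeros(d_out, r))        # up-projection
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # y = x W0^T + scaling * x (B M A)^T
        delta = x @ self.A.T @ self.M.T @ self.B.T
        return self.base(x) + self.scaling * delta
```

Freezing M at the identity recovers plain LoRA, and the "simply mixing" variant the abstract mentions corresponds to another fixed mixer; MoSLoRA instead learns M jointly with A and B, adding only r x r extra parameters per adapted layer.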

@article{wu2025_2406.11909,
  title={Mixture-of-Subspaces in Low-Rank Adaptation},
  author={Taiqiang Wu and Jiahao Wang and Zhe Zhao and Ngai Wong},
  journal={arXiv preprint arXiv:2406.11909},
  year={2025}
}