SALT: Singular Value Adaptation with Low-Rank Transformation

20 March 2025
Abdelrahman Elsayed
Sarim Hashmi
Mohammed Elseiagy
Hu Wang
Mohammad Yaqub
Ibrahim Almakky
Abstract

The complex nature of medical image segmentation calls for models that are specifically designed to capture detailed, domain-specific features. Large foundation models offer considerable flexibility, yet the cost of fine-tuning these models remains a significant barrier. Parameter-Efficient Fine-Tuning (PEFT) methods, such as Low-Rank Adaptation (LoRA), efficiently update model weights with low-rank matrices but may suffer from underfitting when the chosen rank is insufficient to capture domain-specific nuances. Conversely, full-rank Singular Value Decomposition (SVD) based methods provide comprehensive updates by modifying all singular values, yet they often lack flexibility and exhibit variable performance across datasets. We propose SALT (Singular Value Adaptation with Low-Rank Transformation), a method that selectively adapts the most influential singular values using trainable scale and shift parameters while complementing this with a low-rank update for the remaining subspace. This hybrid approach harnesses the advantages of both LoRA and SVD, enabling effective adaptation without relying on increasing model size or depth. Evaluated on 5 challenging medical datasets, ranging from as few as 20 samples to 1000, SALT outperforms state-of-the-art PEFT (LoRA and SVD) by 2% to 5% in Dice with only 3.9% trainable parameters, demonstrating robust adaptation even in low-resource settings. The code for SALT is available at: this https URL
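To make the hybrid update described above concrete, below is a minimal PyTorch sketch of a SALT-style adapted linear layer: the pretrained weight is decomposed with SVD, the top-r singular values receive trainable scale and shift parameters, and a LoRA-style low-rank term covers the remaining subspace. This is not the authors' implementation; the class name, rank choices, and initialization are illustrative assumptions.

import torch
import torch.nn as nn

class SALTLinear(nn.Module):
    # Illustrative sketch of a SALT-style layer (not the official code):
    # top-r singular values get trainable scale/shift, the residual
    # subspace gets a LoRA-style low-rank update.
    def __init__(self, weight: torch.Tensor, top_r: int = 16, lora_rank: int = 4):
        super().__init__()
        # Frozen SVD factors of the pretrained weight W = U diag(S) Vh
        U, S, Vh = torch.linalg.svd(weight, full_matrices=False)
        self.register_buffer("U", U)
        self.register_buffer("S", S)
        self.register_buffer("Vh", Vh)
        self.top_r = min(top_r, S.numel())
        # Trainable scale/shift for the most influential singular values
        self.scale = nn.Parameter(torch.ones(self.top_r))
        self.shift = nn.Parameter(torch.zeros(self.top_r))
        # LoRA-style low-rank update (B initialised to zero, as in LoRA)
        out_dim, in_dim = weight.shape
        self.lora_A = nn.Parameter(torch.randn(lora_rank, in_dim) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_dim, lora_rank))

    def adapted_weight(self) -> torch.Tensor:
        # Scale and shift only the top-r singular values; keep the rest frozen
        s_top = self.scale * self.S[: self.top_r] + self.shift
        s_new = torch.cat([s_top, self.S[self.top_r:]])
        w_svd = self.U @ torch.diag(s_new) @ self.Vh
        # Add the low-rank correction for the remaining subspace
        return w_svd + self.lora_B @ self.lora_A

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x @ self.adapted_weight().T

In such a setup only scale, shift, lora_A, and lora_B receive gradients, which is how a small trainable-parameter budget (on the order of a few percent of the model) could be achieved; the exact ranks and which layers are adapted are design choices reported in the paper.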

@article{elsayed2025_2503.16055,
  title={SALT: Singular Value Adaptation with Low-Rank Transformation},
  author={Abdelrahman Elsayed and Sarim Hashmi and Mohammed Elseiagy and Hu Wang and Mohammad Yaqub and Ibrahim Almakky},
  journal={arXiv preprint arXiv:2503.16055},
  year={2025}
}