Matrix-Transformation Based Low-Rank Adaptation (MTLoRA): A Brain-Inspired Method for Parameter-Efficient Fine-Tuning

12 March 2024
Yao Liang, Yuwei Wang, Yang Li, Yi Zeng
arXiv (abs) · PDF · HTML

Papers citing "Matrix-Transformation Based Low-Rank Adaptation (MTLoRA): A Brain-Inspired Method for Parameter-Efficient Fine-Tuning"

1 / 1 papers shown
Two Is Better Than One: Rotations Scale LoRAs
Hongcan Guo, Guoshun Nan, Yuan Yang, Diyang Zhang, Haotian Li, ..., Yuhan Ran, Xinye Cao, Sicong Leng, Xiaofeng Tao, Xudong Jiang
29 May 2025