LoRA-PAR: A Flexible Dual-System LoRA Partitioning Approach to Efficient LLM Fine-Tuning

28 July 2025
Yining Huang
Bin Li
Keke Tang
Meilian Chen
MoE, LRM
ArXiv (abs) · PDF · HTML · GitHub (1492★)

Papers citing "LoRA-PAR: A Flexible Dual-System LoRA Partitioning Approach to Efficient LLM Fine-Tuning"

1 / 1 papers shown
How Is LLM Reasoning Distracted by Irrelevant Context? An Analysis Using a Controlled Benchmark
Minglai Yang
Ethan Huang
Liang Zhang
Mihai Surdeanu
William Yang Wang
Liangming Pan
LRM
24 May 2025