PLoRA: Efficient LoRA Hyperparameter Tuning for Large Models

4 August 2025
Minghao Yan, Zhuang Wang, Zhen Jia, Shivaram Venkataraman, Yida Wang
ArXiv (abs) · PDF · HTML · GitHub (45,286★)

Papers citing "PLoRA: Efficient LoRA Hyperparameter Tuning for Large Models"

NanoFlux: Adversarial Dual-LLM Evaluation and Distillation For Multi-Domain Reasoning
Raviteja Anantha, Soheil Hor, Teodor Nicola Antoniu, Layne C. Price
AAML · LRM
27 Sep 2025