MELoRA: Mini-Ensemble Low-Rank Adapters for Parameter-Efficient Fine-Tuning
arXiv:2402.17263 · 27 February 2024
Pengjie Ren, Chengshun Shi, Shiguang Wu, Mengqi Zhang, Zhaochun Ren, Maarten de Rijke, Zhumin Chen, Jiahuan Pei
Tags: MoE
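The title names the core idea: instead of one low-rank adapter, train a mini-ensemble of small ones. A hedged sketch of that idea, assuming a block-diagonal arrangement of n rank-r mini adapters (names and shapes are illustrative, not the authors' implementation):

```python
import numpy as np

# Sketch of the mini-ensemble idea: replace one rank-r LoRA update
# (B @ A) for a d_out x d_in weight with n small rank-r adapters,
# each acting on a (d_out/n) x (d_in/n) block along the diagonal.

d_in, d_out, r, n = 8, 8, 2, 4
rng = np.random.default_rng(0)

# Standard LoRA: delta_W = B @ A has rank at most r.
# (In practice B is zero-initialized; random here to compare ranks.)
A = rng.normal(size=(r, d_in))
B = rng.normal(size=(d_out, r))
lora_delta = B @ A
lora_params = A.size + B.size

# Mini-ensemble: n rank-r adapters stacked block-diagonally.
blk_in, blk_out = d_in // n, d_out // n
delta_W = np.zeros((d_out, d_in))
mini_params = 0
for i in range(n):
    Ai = rng.normal(size=(r, blk_in))
    Bi = rng.normal(size=(blk_out, r))
    delta_W[i * blk_out:(i + 1) * blk_out,
            i * blk_in:(i + 1) * blk_in] = Bi @ Ai
    mini_params += Ai.size + Bi.size

# Same trainable-parameter budget, but the block-diagonal update
# can reach rank n*r instead of r.
print(lora_params, mini_params)
print(np.linalg.matrix_rank(lora_delta), np.linalg.matrix_rank(delta_W))
```

With equal parameter counts, the ensemble's update matrix attains a higher maximum rank than a single adapter, which is the trade-off the mini-ensemble design exploits.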
Papers citing "MELoRA: Mini-Ensemble Low-Rank Adapters for Parameter-Efficient Fine-Tuning" (6 of 6 shown)
1. K-LoRA: Unlocking Training-Free Fusion of Any Subject and Style LoRAs
   Ziheng Ouyang, Zhen Li, Qibin Hou
   Tags: MoMe, OffRL · Metrics: 95 · 2 · 0 · 25 Feb 2025

2. GoRA: Gradient-driven Adaptive Low Rank Adaptation
   Haonan He, Peng Ye, Yuchen Ren, Yuan Yuan, Lei Chen
   Tags: AI4TS, AI4CE · Metrics: 105 · 0 · 0 · 13 Feb 2025

3. Chain of LoRA: Efficient Fine-tuning of Language Models via Residual Learning
   Wenhan Xia, Chengwei Qin, Elad Hazan
   Metrics: 46 · 52 · 0 · 08 Jan 2024

4. The Power of Scale for Parameter-Efficient Prompt Tuning
   Brian Lester, Rami Al-Rfou, Noah Constant
   Tags: VPVLM · Metrics: 280 · 3,835 · 0 · 18 Apr 2021

5. Scaling Laws for Neural Language Models
   Jared Kaplan, Sam McCandlish, T. Henighan, Tom B. Brown, B. Chess, R. Child, Scott Gray, Alec Radford, Jeff Wu, Dario Amodei
   Metrics: 226 · 4,424 · 0 · 23 Jan 2020

6. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
   Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
   Tags: ELM · Metrics: 294 · 6,943 · 0 · 20 Apr 2018