arXiv:2501.10062 (v2)
OMoE: Diversifying Mixture of Low-Rank Adaptation by Orthogonal Finetuning
17 January 2025
Jinyuan Feng, Zhiqiang Pu, Tianyi Hu, Dongmin Li, Xiaolin Ai, Huimu Wang
MoE
ArXiv (abs) · PDF · HTML
Papers citing "OMoE: Diversifying Mixture of Low-Rank Adaptation by Orthogonal Finetuning" (8 of 8 papers shown)
FreeFuse: Multi-Subject LoRA Fusion via Auto Masking at Test Time
Yaoli Liu, Yao-Xiang Ding, Kun Zhou
27 Oct 2025
OrthAlign: Orthogonal Subspace Decomposition for Non-Interfering Multi-Objective Alignment
Guanbin Li, Zhihao Xu, Junhao Dong, Jian Zhao, Yuchen Yuan, ..., Zhengtao Yao, Huahui Yi, Dongrui Liu, Xinfeng Li, Kun Wang
29 Sep 2025
Orthogonal Finetuning Made Scalable
Zeju Qiu, Weiyang Liu, Adrian Weller, Bernhard Schölkopf
24 Jun 2025
MoORE: SVD-based Model MoE-ization for Conflict- and Oblivion-Resistant Multi-Task Adaptation
Shen Yuan, Yin Zheng, Taifeng Wang, Binbin Liu, Hongteng Xu
MoMe
17 Jun 2025
Two Is Better Than One: Rotations Scale LoRAs
Hongcan Guo, Guoshun Nan, Yuan Yang, Diyang Zhang, Haotian Li, ..., Yuhan Ran, Xinye Cao, Sicong Leng, Xiaofeng Tao, Xudong Jiang
29 May 2025
CoMoE: Contrastive Representation for Mixture-of-Experts in Parameter-Efficient Fine-tuning
Jinyuan Feng, Chaopeng Wei, Tenghai Qiu, Tianyi Hu, Zhiqiang Pu
MoE
23 May 2025
FT-MoE: Sustainable-learning Mixture of Experts for Fault-Tolerant Computing
Wenjing Xiao, Wenhao Song, Miaojiang Chen, Ruikun Luo
MoE
29 Apr 2025
Mixture of Group Experts for Learning Invariant Representations
Lei Kang, Jia Li, Mi Tian, Hua Huang
MoE
12 Apr 2025