arXiv: 2312.06173
Cited By
Concrete Subspace Learning based Interference Elimination for Multi-task Model Fusion
11 December 2023
A. Tang, Li Shen, Yong Luo, Liang Ding, Han Hu, Bo Du, Dacheng Tao
Tags: MoMe
Papers citing "Concrete Subspace Learning based Interference Elimination for Multi-task Model Fusion" (7 / 7 papers shown)
"Mitigating the Backdoor Effect for Multi-Task Model Merging via Safety-Aware Subspace" (17 Oct 2024)
Jinluan Yang, A. Tang, Didi Zhu, Zhengyu Chen, Li Shen, Fei Wu
Tags: MoMe, AAML
"Merge, Ensemble, and Cooperate! A Survey on Collaborative Strategies in the Era of Large Language Models" (08 Jul 2024)
Jinliang Lu, Ziliang Pang, Min Xiao, Yaochen Zhu, Rui Xia, Jiajun Zhang
Tags: MoMe
"π-Tuning: Transferring Multimodal Foundation Models with Optimal Multi-task Interpolation" (27 Apr 2023)
Chengyue Wu, Teng Wang, Yixiao Ge, Zeyu Lu, Rui-Zhi Zhou, Ying Shan, Ping Luo
Tags: MoMe
"Git Re-Basin: Merging Models modulo Permutation Symmetries" (11 Sep 2022)
Samuel K. Ainsworth, J. Hayase, S. Srinivasa
Tags: MoMe
"Masked Autoencoders Are Scalable Vision Learners" (11 Nov 2021)
Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross B. Girshick
Tags: ViT, TPM
"Optimizing Mode Connectivity via Neuron Alignment" (05 Sep 2020)
N. Joseph Tatro, Pin-Yu Chen, Payel Das, Igor Melnyk, P. Sattigeri, Rongjie Lai
Tags: MoMe
"GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding" (20 Apr 2018)
Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
Tags: ELM