CAT Merging: A Training-Free Approach for Resolving Conflicts in Model Merging

Abstract

Multi-task model merging offers a promising paradigm for integrating multiple expert models into a unified model without additional training. Existing state-of-the-art techniques, such as Task Arithmetic and its variants, merge models by accumulating task vectors -- the parameter differences between pretrained and finetuned models. However, task vector accumulation is often hindered by knowledge conflicts, leading to performance degradation. To address this challenge, we propose Conflict-Aware Task Merging (CAT Merging), a novel training-free framework that selectively trims conflict-prone components from the task vectors. CAT Merging introduces several parameter-specific strategies, including projection for linear weights and masking for scaling and shifting parameters in normalization layers. Extensive experiments on vision, language, and vision-language tasks demonstrate that CAT Merging effectively suppresses knowledge conflicts, achieving average accuracy improvements of up to 2.5% (ViT-B/32) and 2.0% (ViT-L/14) over state-of-the-art methods.
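To make the core idea concrete, the following is a minimal, hypothetical sketch of task-vector merging with a conflict-trimming projection. It is not the paper's actual parameter-specific procedure (which uses distinct strategies for linear weights and normalization parameters); here we simply treat each task vector as a flat array and project out components that point in directions conflicting (negative inner product) with other tasks' vectors, in the spirit of conflict-aware trimming.

```python
import numpy as np

def task_vectors(pretrained, finetuned_list):
    """Task vector = finetuned parameters minus pretrained parameters."""
    return [ft - pretrained for ft in finetuned_list]

def project_out_conflicts(tvs, eps=1e-12):
    """Illustrative conflict trimming (an assumption, not the paper's exact rule):
    for each task vector, remove its component along any other task vector
    with which it has a negative inner product."""
    trimmed = []
    for i, v in enumerate(tvs):
        v = v.astype(float).copy()
        for j, u in enumerate(tvs):
            if i == j:
                continue
            u_hat = u / (np.linalg.norm(u) + eps)  # unit conflict direction
            coef = v @ u_hat
            if coef < 0:  # conflicting component -> project it out
                v = v - coef * u_hat
        trimmed.append(v)
    return trimmed

def merge(pretrained, finetuned_list, alpha=1.0):
    """Task-arithmetic-style merge with the conflict projection applied first."""
    tvs = project_out_conflicts(task_vectors(pretrained, finetuned_list))
    return pretrained + alpha * sum(tvs)
```

For example, with `pretrained = np.zeros(2)` and two finetuned models at `[1, 0]` and `[-0.5, 1]`, the raw task vectors conflict (their inner product is negative); after trimming, each vector's component along the other's conflicting direction is removed before summation.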

@article{sun2025_2505.06977,
  title={CAT Merging: A Training-Free Approach for Resolving Conflicts in Model Merging},
  author={Wenju Sun and Qingyong Li and Yangli-ao Geng and Boyang Li},
  journal={arXiv preprint arXiv:2505.06977},
  year={2025}
}