MINGLE: Mixtures of Null-Space Gated Low-Rank Experts for Test-Time Continual Model Merging

17 May 2025
Zihuan Qiu
Yi Xu
Chiyuan He
Fanman Meng
Linfeng Xu
Qingbo Wu
Hongliang Li
    CLL
    MoMe
Abstract

Continual model merging integrates independently fine-tuned models sequentially without access to original training data, providing a scalable and efficient solution to continual learning. However, current methods still face critical challenges, notably parameter interference among tasks and limited adaptability to evolving test distributions. The former causes catastrophic forgetting of integrated tasks, while the latter hinders effective adaptation to new tasks. To address these, we propose MINGLE, a novel framework for test-time continual model merging, which leverages test-time adaptation using a small set of unlabeled test samples from the current task to dynamically guide the merging process. MINGLE employs a mixture-of-experts architecture composed of parameter-efficient, low-rank experts, enabling efficient adaptation and improving robustness to distribution shifts. To mitigate catastrophic forgetting, we propose Null-Space Constrained Gating, which restricts gating updates to subspaces orthogonal to prior task representations. This suppresses activations on old task inputs and preserves model behavior on past tasks. To further balance stability and adaptability, we design an Adaptive Relaxation Strategy, which dynamically adjusts the constraint strength based on interference signals captured during test-time adaptation. Extensive experiments on standard continual merging benchmarks demonstrate that MINGLE achieves robust generalization, reduces forgetting significantly, and consistently surpasses previous state-of-the-art methods by 7-9% on average across diverse task orders.
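
The gist of the gating mechanism described above can be illustrated with a small sketch. The NumPy code below is an illustrative reading of the abstract only, not the authors' implementation: the variable names (gate, old_feats, relax), the sigmoid gate, the SVD-based 95% energy threshold, and the step size are all assumptions made for this example.

```python
import numpy as np

# Hypothetical dimensions: feature size d, expert rank r, cached old-task samples n_old.
rng = np.random.default_rng(0)
d, r, n_old = 64, 4, 512

# One parameter-efficient low-rank expert (delta-weight B @ A) plus a gating vector
# that scores how strongly the expert should fire for a given input feature.
A = 0.01 * rng.standard_normal((r, d))
B = 0.01 * rng.standard_normal((d, r))
gate = rng.standard_normal(d) / np.sqrt(d)

def expert_output(x, A, B, gate):
    """Gated low-rank expert: sigmoid gate score times the low-rank update applied to x."""
    g = 1.0 / (1.0 + np.exp(-(x @ gate)))
    return g * (x @ A.T @ B.T)

# Representations from previously merged tasks (in practice, a small cached set).
old_feats = rng.standard_normal((n_old, d))

# Principal subspace of old-task representations; its orthogonal complement is the
# "null space" in which gating updates are allowed to move.
_, s, vt = np.linalg.svd(old_feats, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
k = int(np.searchsorted(energy, 0.95)) + 1   # keep ~95% of spectral energy (assumed threshold)
U = vt[:k].T                                 # (d, k) basis of the old-task subspace

def null_space_project(grad, U, relax=0.0):
    """Remove the component of grad lying in span(U).

    relax in [0, 1] softens the constraint: 0 = strict null-space projection,
    1 = unconstrained; a stand-in for an adaptive relaxation schedule driven by
    interference signals observed at test time.
    """
    in_span = U @ (U.T @ grad)
    return grad - (1.0 - relax) * in_span

# Raw gradient for the gating vector from test-time adaptation on the current task.
raw_grad = rng.standard_normal(d)
step = 0.1 * null_space_project(raw_grad, U, relax=0.1)
gate -= step

# The constrained step barely changes gating scores on old-task inputs,
# while remaining free to move in directions outside their subspace.
print("mean |change in gate score| on old-task features:", np.abs(old_feats @ step).mean())
print("expert contribution on a new-task feature:",
      np.linalg.norm(expert_output(rng.standard_normal(d), A, B, gate)))
```

Note that with random Gaussian features the old-task subspace fills most of the feature space, so nearly the entire gradient is suppressed; the method's usefulness rests on prior-task representations concentrating in a lower-dimensional subspace, which leaves room for new-task adaptation.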

View on arXiv
@article{qiu2025_2505.11883,
  title={MINGLE: Mixtures of Null-Space Gated Low-Rank Experts for Test-Time Continual Model Merging},
  author={Zihuan Qiu and Yi Xu and Chiyuan He and Fanman Meng and Linfeng Xu and Qingbo Wu and Hongliang Li},
  journal={arXiv preprint arXiv:2505.11883},
  year={2025}
}