Dynamic Task Vector Grouping for Efficient Multi-Task Prompt Tuning

Abstract

Multi-task prompt tuning utilizes multiple high-resource source tasks to improve performance on low-resource target tasks. Existing approaches transfer the soft prompt trained either on a combination of all source tasks or on a single ``highly similar'' source task, in a one-time-only manner. However, we find that the optimal transfer performance often comes from a combination of source tasks that is neither one nor all of them. Further, we find that the similarity between source and target tasks also changes dynamically during fine-tuning after transferring, making a similarity calculation at the initialization stage alone inadequate. To address these issues, we propose a method called Dynamic Task Vector Grouping (DTVG), whose core ideas are (1) measuring task similarity with task vectors instead of soft prompts, (2) grouping the optimal source task combination based on two metrics: {\it target similarity} and {\it knowledge consistency}, and (3) dynamically updating the combination at each iteration step. Extensive experiments on 26 NLP datasets under different settings demonstrate that DTVG effectively groups similar source tasks while reducing negative transfer, achieving state-of-the-art performance.
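The grouping idea described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the function names (`task_vector`, `group_sources`), the choice of cosine similarity as the target-similarity metric, and the fixed threshold are all assumptions, and the paper's second metric ({\it knowledge consistency}) is not modeled here.

```python
import numpy as np

def task_vector(tuned_params, pretrained_params):
    # A task vector is the parameter delta produced by tuning on one task
    # (here, a flat NumPy array; the paper applies this to soft prompts).
    return tuned_params - pretrained_params

def cosine(a, b):
    # Cosine similarity between two task vectors (epsilon avoids div-by-zero).
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def group_sources(target_tv, source_tvs, threshold=0.0):
    # Hypothetical grouping rule: keep source tasks whose task vectors point
    # in a similar direction to the target's. In DTVG this selection would be
    # recomputed at each iteration step as the target task vector evolves.
    return [i for i, tv in enumerate(source_tvs)
            if cosine(target_tv, tv) > threshold]
```

For example, a source task whose task vector is roughly aligned with the target's would be kept, while one pointing in the opposite direction (a likely source of negative transfer) would be dropped from the group.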

@article{zhang2025_2503.18063,
  title={Dynamic Task Vector Grouping for Efficient Multi-Task Prompt Tuning},
  author={Pieyi Zhang and Richong Zhang and Zhijie Nie},
  journal={arXiv preprint arXiv:2503.18063},
  year={2025}
}