MT2ST: Adaptive Multi-Task to Single-Task Learning

Abstract
Efficient machine learning (ML) has become increasingly important as models grow larger and data volumes expand. In this work, we address the trade-off between the generalization of multi-task learning (MTL) and the precision of single-task learning (STL) by introducing the Multi-Task to Single-Task (MT2ST) framework. MT2ST is designed to improve both training efficiency and accuracy on multi-modal tasks, demonstrating its value as a practical approach to efficient ML.
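The abstract does not spell out the mechanism behind MT2ST. As a purely illustrative sketch (not the authors' algorithm), one common way to transition from multi-task to single-task training is to anneal the auxiliary-task loss weight toward zero, so optimization starts with the generalization benefits of MTL and gradually focuses on the primary task as in STL. The encoder, heads, dimensions, and annealing schedule below are hypothetical choices for demonstration only.

```python
# Hypothetical illustration: annealing an auxiliary loss weight to move from
# multi-task to single-task training. This is NOT the MT2ST method itself.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Shared encoder with a primary head and an auxiliary head (toy dimensions).
encoder = nn.Sequential(nn.Linear(16, 32), nn.ReLU())
primary_head = nn.Linear(32, 4)    # primary task: 4-class classification
auxiliary_head = nn.Linear(32, 1)  # auxiliary task: scalar regression

params = (list(encoder.parameters())
          + list(primary_head.parameters())
          + list(auxiliary_head.parameters()))
optimizer = torch.optim.Adam(params, lr=1e-3)
ce_loss, mse_loss = nn.CrossEntropyLoss(), nn.MSELoss()

# Synthetic data standing in for a real multi-modal dataset.
x = torch.randn(64, 16)
y_primary = torch.randint(0, 4, (64,))
y_auxiliary = torch.randn(64, 1)

total_steps = 100
for step in range(total_steps):
    # Linearly anneal the auxiliary weight from 1.0 to 0.0 over the first half
    # of training: early steps behave like MTL, late steps like STL.
    aux_weight = max(0.0, 1.0 - step / (0.5 * total_steps))

    features = encoder(x)
    loss = ce_loss(primary_head(features), y_primary)
    if aux_weight > 0.0:
        loss = loss + aux_weight * mse_loss(auxiliary_head(features), y_auxiliary)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```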
BibTeX:
@article{liu2025_2406.18038,
  title   = {MT2ST: Adaptive Multi-Task to Single-Task Learning},
  author  = {Dong Liu and Yanxuan Yu},
  journal = {arXiv preprint arXiv:2406.18038},
  year    = {2025}
}