Enhancing the Performance of Global Model by Improving the Adaptability of Local Models in Federated Learning

Abstract

Federated learning enables clients to collaboratively train a global model that is aggregated from their local models. Due to the heterogeneous data distributions across clients and the privacy constraints of federated learning, it is difficult to train local models that yield a well-performing global model. In this paper, we introduce the adaptability of local models, i.e., the average performance of local models on the data distributions over clients, and enhance the performance of the global model by improving the adaptability of local models. Since each client does not know the data distributions of the other clients, the adaptability of a local model cannot be optimized directly. We first characterize the property of an appropriate local model that adapts well to the data distributions over clients. We then formalize this property as a local training objective with a constraint and propose a feasible solution for training the local model. Extensive experiments on federated learning benchmarks demonstrate that our method significantly improves the adaptability of local models and achieves a well-performing global model that consistently outperforms the baseline methods.
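
The abstract does not spell out the constrained local objective, so the following is only a minimal sketch of the general setup it describes: FedAvg-style rounds in which each client trains locally under a constraint intended to keep the local model useful beyond its own data distribution. The L2 proximity penalty to the global weights, the coefficient mu, and the helper names local_update and aggregate are illustrative assumptions, not the paper's actual formulation.

# Sketch of constrained local training in a federated round (assumptions noted above).
import copy
import torch
import torch.nn.functional as F

def local_update(global_model, loader, mu=0.1, lr=0.01, epochs=1):
    """Train a local copy of the global model; mu weights the (assumed) constraint term."""
    local_model = copy.deepcopy(global_model)
    global_params = [p.detach().clone() for p in global_model.parameters()]
    opt = torch.optim.SGD(local_model.parameters(), lr=lr)
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = F.cross_entropy(local_model(x), y)
            # Illustrative constraint: penalize drift from the global weights so the
            # local model does not overfit its own client's distribution.
            prox = sum((p - g).pow(2).sum()
                       for p, g in zip(local_model.parameters(), global_params))
            (loss + 0.5 * mu * prox).backward()
            opt.step()
    return local_model.state_dict()

def aggregate(state_dicts, weights):
    """FedAvg-style weighted averaging of client model parameters."""
    avg = copy.deepcopy(state_dicts[0])
    for k in avg:
        avg[k] = sum(w * sd[k] for sd, w in zip(state_dicts, weights))
    return avg

Setting mu=0 recovers plain local SGD; the constraint strength controls how strongly local training is pulled toward behavior that generalizes across clients.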

@article{zhou2025_2505.10125,
  title={Enhancing the Performance of Global Model by Improving the Adaptability of Local Models in Federated Learning},
  author={Wujun Zhou and Shu Ding and ZeLin Li and Wei Wang},
  journal={arXiv preprint arXiv:2505.10125},
  year={2025}
}