Heterogeneity-Aware Client Sampling: A Unified Solution for Consistent Federated Learning

Abstract

Federated learning (FL) commonly involves clients with diverse communication and computational capabilities. Such heterogeneity can significantly distort the optimization dynamics and lead to objective inconsistency, where the global model converges to an incorrect stationary point potentially far from the true optimum. Despite its critical impact, the joint effect of communication and computation heterogeneity has remained largely unexplored, due to the intrinsic complexity of their interaction. In this paper, we reveal the fundamentally distinct mechanisms through which heterogeneous communication and computation drive inconsistency in FL. To the best of our knowledge, this is the first unified theoretical analysis of general heterogeneous FL, offering a principled understanding of how these two forms of heterogeneity jointly distort the optimization trajectory under arbitrary choices of local solvers. Motivated by these insights, we propose Federated Heterogeneity-Aware Client Sampling (FedACS), a universal method to eliminate all types of objective inconsistency. We theoretically prove that FedACS converges to the correct optimum at a rate of $O(1/\sqrt{R})$, even in dynamic heterogeneous environments. Extensive experiments across multiple datasets show that FedACS outperforms state-of-the-art and category-specific baselines by 4.3%-36%, while reducing communication costs by 22%-89% and computation loads by 14%-105%, respectively.

@article{weng2025_2505.11304,
  title={Heterogeneity-Aware Client Sampling: A Unified Solution for Consistent Federated Learning},
  author={Shudi Weng and Chao Ren and Ming Xiao and Mikael Skoglund},
  journal={arXiv preprint arXiv:2505.11304},
  year={2025}
}