FedSAUC: A Similarity-Aware Update Control for Communication-Efficient Federated Learning in Edge Computing

Federated learning is a distributed machine learning framework for collaboratively training a global model without uploading privacy-sensitive data to a centralized server. The framework is typically applied to edge devices such as smartphones, wearables, and Internet of Things (IoT) devices, which collect information directly from users. However, these devices are mostly battery-powered, and the update procedure of federated learning constantly consumes battery power and transmission bandwidth. In this work, we propose FedSAUC, an update control for federated learning that considers the similarity of users' behaviors (models). On the server side, we exploit clustering algorithms to group devices with similar models, and then select representatives from each cluster to upload their updates for training. We also implemented a prototype testbed on edge devices to validate the performance. The experimental results show that this update control does not affect training accuracy in the long run.
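The server-side idea described above, clustering devices by model similarity and asking only a few representatives per cluster to upload updates, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the use of plain k-means over flattened weight vectors, the farthest-point initialization, and the first-member representative rule are all assumptions made for the demo.

```python
import numpy as np

def cluster_and_select(client_weights, k=3, reps_per_cluster=1):
    """Group clients whose flattened model weights are similar, then pick
    representatives per cluster to upload updates this round.
    (A sketch of similarity-aware update control; details are assumptions.)"""
    ids = list(client_weights)
    X = np.stack([client_weights[c] for c in ids])

    # Farthest-point initialization keeps this demo deterministic.
    centers = [X[0]]
    for _ in range(k - 1):
        dist = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[dist.argmax()])
    centers = np.stack(centers)

    # A few Lloyd (k-means) iterations over Euclidean distance.
    for _ in range(20):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)

    # Only the first reps_per_cluster members of each cluster upload updates;
    # the rest skip this round, saving battery and bandwidth.
    selected = []
    for j in range(k):
        members = [c for c, lab in zip(ids, labels) if lab == j]
        selected.extend(members[:reps_per_cluster])
    return selected

# 9 hypothetical devices whose weights form 3 similarity groups.
rng = np.random.default_rng(1)
clients = {f"dev{i}": rng.normal(loc=i // 3, scale=0.05, size=16) for i in range(9)}
reps = cluster_and_select(clients, k=3)
print(reps)  # one representative per behavior cluster
```

In this toy run, only 3 of the 9 devices transmit in the round, while each behavior cluster still contributes an update to the global model.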
@article{lee2025_2504.04867,
  title={FedSAUC: A Similarity-Aware Update Control for Communication-Efficient Federated Learning in Edge Computing},
  author={Ming-Lun Lee and Han-Chang Chou and Yan-Ann Chen},
  journal={arXiv preprint arXiv:2504.04867},
  year={2025}
}