Adaptive Local Steps Federated Learning with Differential Privacy Driven by Convergence Analysis

IEEE International Symposium on a World of Wireless, Mobile and Multimedia Networks (WoWMoM), 2023
Main: 8 pages
Bibliography: 2 pages
Appendix: 3 pages
Figures: 5
Tables: 5
Abstract

Federated Learning (FL) is a distributed machine learning technique that allows model training across multiple devices or organizations without sharing raw data. However, while FL ensures that the raw data is not directly accessible to external adversaries, adversaries can still obtain statistical information about the data through differential attacks. Differential Privacy (DP) has been proposed as a countermeasure: it adds noise to the model or gradients so that adversaries cannot infer private information from the transmitted parameters. We reconsider the framework of differentially private federated learning in resource-constrained scenarios (limited privacy budget and communication resources). We analyze the convergence of federated learning with differential privacy (DPFL) in such scenarios and propose an Adaptive Local Steps Differential Privacy Federated Learning (ALS-DPFL) algorithm. We evaluate our algorithm on the FashionMNIST and CIFAR-10 datasets and achieve strong performance relative to prior work.
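The DP mechanism described in the abstract, adding noise to gradients before they leave a client, can be sketched with the standard Gaussian mechanism: clip the update to bound its sensitivity, then add calibrated Gaussian noise. This is a minimal illustration, not the paper's ALS-DPFL algorithm; the function name and parameter values are assumptions for the example.

```python
import numpy as np

def dp_sanitize(grad, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """Clip a client update to clip_norm and add Gaussian noise.

    This is the generic Gaussian-mechanism step used in DP federated
    learning; the privacy guarantee depends on noise_multiplier and on
    how many rounds the client participates in.
    """
    rng = np.random.default_rng() if rng is None else rng
    norm = np.linalg.norm(grad)
    # Scale down only if the update exceeds the clipping threshold,
    # so the L2 sensitivity of the release is at most clip_norm.
    clipped = grad / max(1.0, norm / clip_norm)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=grad.shape)
    return clipped + noise

# A client sanitizes its local update before uploading it to the server.
update = np.array([3.0, 4.0])  # norm 5.0 exceeds clip_norm, so it is scaled
private_update = dp_sanitize(update, clip_norm=1.0, noise_multiplier=0.5)
```

With `noise_multiplier=0` the function reduces to pure norm clipping, which makes the sensitivity bound easy to check in isolation.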
