Differentially Private Online Federated Learning with Correlated Noise

IEEE Conference on Decision and Control (CDC), 2024
Main: 5 pages · 4 figures · Bibliography: 2 pages · Appendix: 4 pages
Abstract

We propose a novel differentially private algorithm for online federated learning that employs temporally correlated noise to improve utility while ensuring the privacy of the continuously released models. To address challenges stemming from differential privacy (DP) noise and local updates with streaming non-i.i.d. data, we develop a perturbed iterate analysis to control the impact of the DP noise on utility. Moreover, we demonstrate how the drift errors from local updates can be effectively managed under a quasi-strong convexity condition. Subject to an $(\epsilon, \delta)$-DP budget, we establish a dynamic regret bound over the entire time horizon that quantifies the impact of key parameters and the intensity of changes in dynamic environments. Numerical experiments validate the efficacy of the proposed algorithm.
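To illustrate the core idea of temporally correlated DP noise in local updates, here is a minimal, hypothetical sketch (not the paper's algorithm): each client runs local SGD while perturbing its gradients with an AR(1) Gaussian noise stream, one simple way to induce temporal correlation across the noise added to continuously released models. The function names, the AR(1) choice, and all parameters are illustrative assumptions.

```python
import numpy as np


def correlated_noise_stream(rho, sigma, dim, steps, rng):
    """Yield temporally correlated Gaussian noise via an AR(1) process:
    z_t = rho * z_{t-1} + sqrt(1 - rho^2) * w_t, with w_t ~ N(0, sigma^2 I).
    The sqrt(1 - rho^2) scaling keeps the stationary variance at sigma^2."""
    z = np.zeros(dim)
    for _ in range(steps):
        w = rng.normal(0.0, sigma, size=dim)
        z = rho * z + np.sqrt(1.0 - rho**2) * w
        yield z


def noisy_local_sgd(x0, grad_fn, lr, rho, sigma, steps, rng):
    """One client's local SGD where each gradient step is perturbed by
    the correlated noise stream (an illustrative stand-in for DP noise)."""
    x = x0.copy()
    for z in correlated_noise_stream(rho, sigma, x.size, steps, rng):
        x = x - lr * (grad_fn(x) + z)  # privatized update direction
    return x


# Usage sketch: a quadratic objective f(x) = 0.5 * ||x||^2, so grad f(x) = x.
rng = np.random.default_rng(0)
x_final = noisy_local_sgd(np.ones(5), lambda v: v, lr=0.1,
                          rho=0.9, sigma=0.1, steps=100, rng=rng)
```

Setting `rho = 0` recovers independent per-step noise; `rho > 0` correlates successive perturbations, which is the kind of structure the paper exploits to improve utility under a fixed privacy budget.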
