Sharp Gaussian approximations for Decentralized Federated Learning

Federated Learning has gained traction in privacy-sensitive collaborative environments, with local SGD emerging as a key optimization method in decentralized settings. While its convergence properties are well-studied, asymptotic statistical guarantees beyond convergence remain limited. In this paper, we present two generalized Gaussian approximation results for local SGD and explore their implications. First, we prove a Berry-Esseen theorem for the final local SGD iterates, enabling valid multiplier bootstrap procedures. Second, motivated by robustness considerations, we introduce two distinct time-uniform Gaussian approximations for the entire trajectory of local SGD. The time-uniform approximations support Gaussian bootstrap-based tests for detecting adversarial attacks. Extensive simulations are provided to support our theoretical results.
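To make the objects mentioned above concrete, here is a minimal toy sketch of local SGD with periodic averaging across clients, followed by a multiplier-bootstrap confidence interval for one coordinate of the final iterate. This is an illustration under stated assumptions, not the paper's exact procedure: the linear-regression model, step-size schedule, multiplier distribution, and all names (K, d, T, H, local_sgd, etc.) are hypothetical choices made for the sketch.

```python
import numpy as np

# Toy sketch: local SGD on K clients for linear regression, then a
# multiplier-bootstrap confidence interval for theta_star[0].
# All modeling choices below are illustrative assumptions.

rng = np.random.default_rng(0)
K, d, T, H = 8, 5, 500, 10             # clients, dimension, iterations, sync period
theta_star = np.ones(d)                # true parameter of the toy model

# Fixed local data sets so the bootstrap reuses the same samples.
X = rng.normal(size=(K, T, d))
y = np.einsum("ktd,d->kt", X, theta_star) + rng.normal(size=(K, T))

def local_sgd(weights=None):
    """Run local SGD; optional multiplier weights w[k, t] rescale each gradient."""
    thetas = np.zeros((K, d))
    for t in range(T):
        lr = 0.5 / (t + 10)                       # decaying step size (illustrative)
        for k in range(K):
            g = (X[k, t] @ thetas[k] - y[k, t]) * X[k, t]
            if weights is not None:
                g = weights[k, t] * g             # multiplier-bootstrap perturbation
            thetas[k] -= lr * g
        if (t + 1) % H == 0:                      # communication round: average iterates
            thetas[:] = thetas.mean(axis=0)
    return thetas.mean(axis=0)                    # final (averaged) local SGD iterate

theta_hat = local_sgd()

# Multiplier bootstrap: rerun with i.i.d. mean-one Gaussian weights and use the
# quantiles of the recentred replicates to form a confidence interval.
B = 100
reps = np.array([local_sgd(weights=1.0 + rng.normal(size=(K, T)))[0] for _ in range(B)])
lo, hi = np.quantile(reps - theta_hat[0], [0.025, 0.975])
print(f"95% CI for theta[0]: [{theta_hat[0] - hi:.3f}, {theta_hat[0] - lo:.3f}]")
```

The Berry-Esseen result in the abstract is what would justify treating the recentred final iterate as approximately Gaussian, so that bootstrap quantiles of this kind yield asymptotically valid intervals; the time-uniform approximations play the analogous role for tests that monitor the whole trajectory.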
@article{bonnerjee2025_2505.08125,
  title   = {Sharp Gaussian approximations for Decentralized Federated Learning},
  author  = {Soham Bonnerjee and Sayar Karmakar and Wei Biao Wu},
  journal = {arXiv preprint arXiv:2505.08125},
  year    = {2025}
}