
DS FedProxGrad: Asymptotic Stationarity Without Noise Floor in Fair Federated Learning

Huzaifa Arif
Main: 9 pages
Bibliography: 1 page
1 table
Abstract

Recent work \cite{arifgroup} introduced Federated Proximal Gradient \textbf{(\texttt{FedProxGrad})} for solving non-convex composite optimization problems in group-fair federated learning. However, the original analysis established convergence only to a \textit{noise-dominated neighborhood of stationarity}, with explicit dependence on a variance-induced noise floor. In this work, we provide an improved asymptotic convergence analysis for a generalized \texttt{FedProxGrad}-type analytical framework with inexact local proximal solutions and explicit fairness regularization. We call this extended framework \textbf{DS \texttt{FedProxGrad}} (Decay Step Size \texttt{FedProxGrad}). Under a Robbins-Monro step-size schedule \cite{robbins1951stochastic} and a mild decay condition on local inexactness, we prove that $\liminf_{r\to\infty} \mathbb{E}[\|\nabla F(\mathbf{x}^r)\|^2] = 0$, i.e., the algorithm is asymptotically stationary and its convergence guarantee does not depend on a variance-induced noise floor.
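For concreteness, a Robbins-Monro schedule requires the step sizes $\{\eta_r\}$ (notation assumed here; the paper's symbols may differ) to satisfy

$$\sum_{r=0}^{\infty} \eta_r = \infty \quad \text{and} \quad \sum_{r=0}^{\infty} \eta_r^2 < \infty,$$

of which $\eta_r = \eta_0/(r+1)$ is the canonical example. Intuitively, square-summability forces the accumulated variance of the stochastic updates to stay finite, which is what removes the noise floor, while the divergent sum ensures the iterates retain enough step-size budget to reach stationarity.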
