A Randomized Zeroth-Order Hierarchical Framework for Heterogeneous Federated Learning

Abstract

Heterogeneity in federated learning (FL) is a critical and challenging aspect that significantly impacts model performance and convergence. In this paper, we propose a novel framework that formulates heterogeneous FL as a hierarchical optimization problem. This framework captures both the local and global training processes through a bilevel formulation and is capable of the following: (i) addressing client heterogeneity through a personalized learning framework; (ii) capturing the pre-training process on the server's side; (iii) updating the global model through nonstandard aggregation; (iv) allowing for nonidentical local steps; and (v) capturing clients' local constraints. We design and analyze an implicit zeroth-order FL method (ZO-HFL), equipped with nonasymptotic convergence guarantees for both the server-agent and the individual client-agents, as well as asymptotic guarantees for both in an almost sure sense. Notably, our method does not rely on standard assumptions in heterogeneous FL, such as the bounded gradient dissimilarity condition. We implement our method on image classification tasks and compare it with other methods under different heterogeneous settings.
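To illustrate the zeroth-order idea underlying such methods (this is a generic two-point randomized gradient estimator, not the authors' ZO-HFL algorithm; the function and parameter names below are hypothetical), a minimal sketch is:

```python
import numpy as np

def zo_gradient(f, x, mu=1e-4, n_samples=20, seed=None):
    """Randomized zeroth-order estimate of the gradient of f at x.

    Averages two-point finite differences along random Gaussian
    directions u_i:  g ~ mean_i [(f(x + mu*u_i) - f(x)) / mu] * u_i.
    Only function evaluations of f are used, no derivatives.
    """
    rng = np.random.default_rng(seed)
    fx = f(x)
    g = np.zeros_like(x)
    for _ in range(n_samples):
        u = rng.standard_normal(x.size)
        g += (f(x + mu * u) - fx) / mu * u
    return g / n_samples

# Example: f(x) = ||x||^2 has true gradient 2x.
x = np.array([1.0, -2.0, 0.5])
g = zo_gradient(lambda v: float(np.dot(v, v)), x, n_samples=5000, seed=0)
```

With enough samples the estimate concentrates around the smoothed gradient; in a federated setting each client would apply such an estimator to its own local loss, which is what makes zeroth-order schemes attractive when explicit gradients are unavailable.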

@article{qiu2025_2504.01839,
  title={A Randomized Zeroth-Order Hierarchical Framework for Heterogeneous Federated Learning},
  author={Yuyang Qiu and Kibaek Kim and Farzad Yousefian},
  journal={arXiv preprint arXiv:2504.01839},
  year={2025}
}