
Asynchronous Decentralized SGD under Non-Convexity: A Block-Coordinate Descent Framework

Abstract

Decentralized optimization has become vital for leveraging distributed data without central control, enhancing scalability and privacy. However, practical deployments face fundamental challenges due to heterogeneous computation speeds and unpredictable communication delays. This paper introduces a refined model of Asynchronous Decentralized Stochastic Gradient Descent (ADSGD) under practical assumptions of bounded computation and communication times. To understand the convergence of ADSGD, we first analyze Asynchronous Stochastic Block Coordinate Descent (ASBCD) as an analytical tool, and then show that ADSGD converges under computation-delay-independent step sizes. The convergence result is established without assuming bounded data heterogeneity. Experiments show that ADSGD outperforms existing methods in wall-clock convergence time across various scenarios. With its simplicity, efficiency in memory and communication, and resilience to communication and computation delays, ADSGD is well-suited for real-world decentralized learning tasks.
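
The abstract describes ADSGD only at a high level. As a rough illustration of the kind of update decentralized SGD performs (a local stochastic gradient step followed by averaging with neighbors via a mixing matrix), the following is a minimal synchronous sketch. The quadratic objectives, ring topology, noise model, and step size are illustrative assumptions; this is not the paper's ADSGD, which runs asynchronously under bounded computation and communication delays.

```python
# Minimal sketch of decentralized SGD with gossip averaging (illustrative only;
# not the paper's asynchronous ADSGD). All problem data below are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_agents, dim, step_size, n_iters = 5, 10, 0.05, 200

# Each agent i holds a local quadratic objective f_i(x) = 0.5 * ||x - b_i||^2;
# heterogeneous targets b_i stand in for heterogeneous local data.
targets = rng.normal(size=(n_agents, dim))
x = rng.normal(size=(n_agents, dim))  # one model copy per agent

# Doubly stochastic mixing matrix for a ring graph (self weight 1/2, neighbors 1/4 each).
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i - 1) % n_agents] = 0.25
    W[i, (i + 1) % n_agents] = 0.25

for _ in range(n_iters):
    # Stochastic gradient of f_i at x_i (exact gradient plus noise as a stand-in).
    grads = (x - targets) + 0.1 * rng.normal(size=x.shape)
    # Local SGD step followed by averaging with neighbors.
    x = W @ (x - step_size * grads)

consensus_error = np.linalg.norm(x - x.mean(axis=0))
print(f"consensus error after {n_iters} rounds: {consensus_error:.4f}")
```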

@article{zhou2025_2505.10322,
  title={Asynchronous Decentralized SGD under Non-Convexity: A Block-Coordinate Descent Framework},
  author={Yijie Zhou and Shi Pu},
  journal={arXiv preprint arXiv:2505.10322},
  year={2025}
}