Decentralized Optimization with Topology-Independent Communication

Distributed optimization requires nodes to coordinate, yet full synchronization scales poorly. When nodes collaborate through pairwise regularizers, standard methods demand communication across every coupled pair at every iteration. This paper proposes randomized local coordination: each node independently samples one regularizer uniformly at random and coordinates only with the nodes sharing that term. This exploits partial separability, where each regularizer depends on only a subset of the nodes. For graph-guided regularizers, where each term couples the two endpoints of an edge, expected communication drops to exactly 2 messages per iteration. The analysis establishes iteration-complexity bounds for the method: convergence to an $\varepsilon$-solution for convex objectives and convergence to a neighborhood of the solution under strong convexity. Replacing the proximal map of the sum of regularizers with the proximal map of a single randomly selected regularizer preserves convergence while eliminating global coordination. Experiments validate both the convergence rates and the communication efficiency on synthetic and real-world datasets.
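To make the sampling scheme concrete, the following is a minimal single-process sketch of one plausible instantiation. It assumes quadratic local losses $f_i(x_i) = \tfrac{1}{2}(x_i - a_i)^2$, graph-guided $\ell_1$ terms $\lambda|x_i - x_j|$ on a ring of 6 nodes, and an $m$-scaled proximal step for the sampled term; all of these choices (including the step sizes and the pull-based message accounting) are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem (not from the paper):
#   min_x  sum_i 0.5*(x_i - a_i)^2  +  lam * sum_{(i,j) in E} |x_i - x_j|
n = 6
a = rng.normal(size=n)                                     # each node's private data
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0)]   # ring topology
m = len(edges)
lam, gamma, T = 0.1, 0.05, 2000

x = np.zeros(n)
messages = 0
for _ in range(T):
    # Local gradient step on each node's private loss f_i.
    y = x - gamma * (x - a)
    x_new = y.copy()
    for i in range(n):
        # Each node independently samples ONE regularizer uniformly from all m.
        j, k = edges[rng.integers(m)]
        if i not in (j, k):
            continue                # sampled term doesn't involve node i: no communication
        partner = k if i == j else j
        messages += 1               # node i pulls its partner's current value
        # Prox of the single sampled term lam*|x_i - x_partner| in x_i only,
        # scaled by m to offset the 1/m sampling probability (an assumption
        # of this sketch): a soft-threshold toward the partner's value.
        d = y[i] - y[partner]
        x_new[i] = y[partner] + np.sign(d) * max(abs(d) - m * gamma * lam, 0.0)
    x = x_new

print("x ≈", np.round(x, 3))
print("observed messages/iter:", messages / T)
```

Under this setup the expected message count matches the abstract's claim: each node's sampled edge contains it with probability $\deg(i)/m = 2/6$, so the expected total is $\sum_i \deg(i)/m = 2|E|/m = 2$ messages per iteration, independent of the topology's size.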