Optimal Complexity in Byzantine-Robust Distributed Stochastic Optimization with Data Heterogeneity

20 March 2025
Qiankun Shi
Jie Peng
Kun Yuan
Xiao Wang
Qing Ling
Abstract

In this paper, we establish tight lower bounds for Byzantine-robust distributed first-order stochastic optimization methods in both the strongly convex and non-convex settings. We reveal that when the distributed nodes have heterogeneous data, the convergence error comprises two components: a non-vanishing Byzantine error and a vanishing optimization error. We establish lower bounds on the Byzantine error and on the minimum number of queries to a stochastic gradient oracle required to achieve an arbitrarily small optimization error. However, we identify significant discrepancies between these lower bounds and the existing upper bounds. To close this gap, we leverage Nesterov's acceleration and variance reduction to develop novel Byzantine-robust distributed stochastic optimization methods that provably match the lower bounds up to logarithmic factors, implying that the established lower bounds are tight.
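To make the problem setting concrete, below is a minimal Python sketch of Byzantine-robust distributed SGD under data heterogeneity, using coordinate-wise median as an illustrative robust aggregator. This is not the paper's accelerated, variance-reduced method; the function names, quadratic local objectives, and parameter choices are hypothetical, chosen only to show how honest and Byzantine gradients are combined at the server.

# Hypothetical sketch of the Byzantine-robust distributed setting studied in the paper:
# n workers hold heterogeneous data; some are Byzantine and may send arbitrary vectors.
# The server aggregates worker gradients with a robust rule (coordinate-wise median here,
# purely for illustration -- not the paper's accelerated, variance-reduced algorithm).
import numpy as np

def honest_gradient(x, data_mean, noise_std=0.1, rng=None):
    """Stochastic gradient of a toy quadratic f_i(x) = 0.5 * ||x - data_mean||^2."""
    rng = rng or np.random.default_rng()
    return (x - data_mean) + noise_std * rng.standard_normal(x.shape)

def byzantine_gradient(x, scale=10.0, rng=None):
    """Adversarial worker: sends an arbitrary (here, large random) vector."""
    rng = rng or np.random.default_rng()
    return scale * rng.standard_normal(x.shape)

def robust_aggregate(grads):
    """Coordinate-wise median: one classical robust aggregator (illustrative choice)."""
    return np.median(np.stack(grads), axis=0)

def robust_distributed_sgd(n_workers=10, n_byzantine=2, dim=5, steps=200, lr=0.1, seed=0):
    rng = np.random.default_rng(seed)
    # Heterogeneous data: each honest worker has a different local optimum.
    local_means = rng.standard_normal((n_workers, dim))
    x = np.zeros(dim)
    for _ in range(steps):
        grads = []
        for i in range(n_workers):
            if i < n_byzantine:
                grads.append(byzantine_gradient(x, rng=rng))
            else:
                grads.append(honest_gradient(x, local_means[i], rng=rng))
        x -= lr * robust_aggregate(grads)
    return x

if __name__ == "__main__":
    print("Final iterate:", robust_distributed_sgd())

Even with a robust aggregator, heterogeneity among the honest workers leaves a residual bias at the aggregation step, which is the intuition behind the non-vanishing Byzantine error described in the abstract.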

@article{shi2025_2503.16337,
  title={Optimal Complexity in Byzantine-Robust Distributed Stochastic Optimization with Data Heterogeneity},
  author={Qiankun Shi and Jie Peng and Kun Yuan and Xiao Wang and Qing Ling},
  journal={arXiv preprint arXiv:2503.16337},
  year={2025}
}