arXiv:2309.13307
CORE: Common Random Reconstruction for Distributed Optimization with Provable Low Communication Complexity

23 September 2023
Pengyun Yue
Hanzheng Zhao
Cong Fang
Di He
Liwei Wang
Zhouchen Lin
Song-Chun Zhu
Abstract

With distributed machine learning being a prominent technique for large-scale machine learning tasks, communication complexity has become a major bottleneck for speeding up training and scaling up the number of machines. In this paper, we propose a new technique named Common randOm REconstruction (CORE), which can be used to compress the information transmitted between machines in order to reduce communication complexity without additional strict conditions. Specifically, CORE projects the vector-valued information to a low-dimensional one through common random vectors and reconstructs the information with the same random noises after communication. We apply CORE to two distributed tasks, convex optimization on linear models and generic non-convex optimization, and design new distributed algorithms that achieve provably lower communication complexities. For example, we show that for linear models a CORE-based algorithm can encode the gradient vector to O(1) bits (against O(d)), with a convergence rate that is no worse, improving on existing results.
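The core idea of common random reconstruction can be sketched in a few lines: every machine seeds its generator identically, so the sender only transmits the k-dimensional projection and the receiver regenerates the same random matrix to form an unbiased estimate of the original d-dimensional gradient. This is a minimal illustrative sketch, not the authors' exact algorithm; the function names and the Gaussian projection are assumptions for illustration.

```python
import numpy as np

def core_compress(grad: np.ndarray, k: int, seed: int) -> np.ndarray:
    """Project a d-dimensional gradient down to k dimensions."""
    rng = np.random.default_rng(seed)           # same seed on every machine
    A = rng.standard_normal((k, grad.size)) / np.sqrt(k)
    return A @ grad                             # only k numbers are transmitted

def core_reconstruct(msg: np.ndarray, d: int, seed: int) -> np.ndarray:
    """Rebuild an unbiased estimate of the gradient from the k-dim message."""
    rng = np.random.default_rng(seed)           # regenerates the identical matrix
    A = rng.standard_normal((msg.size, d)) / np.sqrt(msg.size)
    return A.T @ msg                            # E[A.T @ A @ g] = g, so unbiased
```

A single round gives a noisy estimate; in an iterative optimizer the seed would change each round, so the projection noise averages out across iterations rather than accumulating.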

View on arXiv
Comments on this paper