
A Communication Efficient Collaborative Learning Framework for Distributed Features

24 December 2019
Yang Liu
Yan Kang
Xinwei Zhang
Liping Li
Yong Cheng
Tianjian Chen
Mingyi Hong
Qiang Yang
arXiv:1912.11187
Abstract

We introduce a collaborative learning framework that allows multiple parties, each holding a different set of attributes about the same users, to jointly build models without exposing their raw data or model parameters. In particular, we propose a Federated Stochastic Block Coordinate Descent (FedBCD) algorithm, in which each party conducts multiple local updates before each communication to effectively reduce the number of communication rounds among parties, a principal bottleneck for collaborative learning problems. We theoretically analyze the impact of the number of local updates and show that when the batch size, sample size, and number of local iterations are selected appropriately, within $T$ iterations the algorithm performs $\mathcal{O}(\sqrt{T})$ communication rounds and achieves an $\mathcal{O}(1/\sqrt{T})$ accuracy (measured by the average of the gradient norm squared). The approach is supported by our empirical evaluations on a variety of tasks and datasets, demonstrating advantages over stochastic gradient descent (SGD) approaches.
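To make the multiple-local-updates idea concrete, the following is a minimal sketch, not the authors' implementation: it assumes a feature-partitioned (vertical) linear regression with squared loss, two parties updating their parameter blocks in parallel against each other's stale partial predictions, and illustrative hyperparameters (eta, Q, batch).

# Hypothetical sketch of the FedBCD idea: two parties hold disjoint feature blocks
# of the same samples and run Q local block-coordinate SGD steps between
# communications, exchanging only intermediate predictions (never raw data or parameters).
import numpy as np

rng = np.random.default_rng(0)
n, d_a, d_b = 256, 5, 3
X_a = rng.normal(size=(n, d_a))          # party A's features
X_b = rng.normal(size=(n, d_b))          # party B's features (same samples, different attributes)
w_true = rng.normal(size=d_a + d_b)
y = X_a @ w_true[:d_a] + X_b @ w_true[d_a:] + 0.01 * rng.normal(size=n)

w_a, w_b = np.zeros(d_a), np.zeros(d_b)  # each party keeps only its own parameter block
eta, Q, rounds, batch = 0.05, 5, 40, 32  # Q = local updates per communication round (assumed values)

for _ in range(rounds):
    # One communication: parties exchange their current partial predictions.
    z_a_stale, z_b_stale = X_a @ w_a, X_b @ w_b
    for _ in range(Q):
        idx = rng.choice(n, size=batch, replace=False)
        # Each party updates its own block using the other's stale contribution.
        resid_a = (X_a[idx] @ w_a + z_b_stale[idx]) - y[idx]
        resid_b = (z_a_stale[idx] + X_b[idx] @ w_b) - y[idx]
        w_a -= eta * X_a[idx].T @ resid_a / batch
        w_b -= eta * X_b[idx].T @ resid_b / batch

print("train MSE:", np.mean((X_a @ w_a + X_b @ w_b - y) ** 2))

Communication here happens once per outer round rather than once per gradient step, which is the source of the claimed reduction in communication rounds; the price is that local steps use slightly stale information from the other party.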
