Timely Communication in Federated Learning

31 December 2020
Baturalp Buyukates
S. Ulukus
    FedML
arXiv:2012.15831
Abstract

We consider a federated learning framework in which a parameter server (PS) trains a global model by using $n$ clients without actually storing the client data centrally at a cloud server. Focusing on a setting where the client datasets are fast changing and highly temporal in nature, we investigate the timeliness of model updates and propose a novel timely communication scheme. Under the proposed scheme, at each iteration, the PS waits for $m$ available clients and sends them the current model. Then, the PS uses the local updates of the earliest $k$ out of $m$ clients to update the global model at each iteration. We find the average age of information experienced by each client and numerically characterize the age-optimal $m$ and $k$ values for a given $n$. Our results indicate that, in addition to ensuring timeliness, the proposed communication scheme results in significantly smaller average iteration times compared to random client selection without hurting the convergence of the global learning task.
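The wait-for-$m$, keep-the-earliest-$k$ mechanism can be illustrated with a short timing simulation. The sketch below is not the authors' implementation: the client availability and local update durations are modeled as i.i.d. exponential random variables, and the function names and rate parameters are illustrative assumptions chosen only to show why taking the earliest $k$ of $m$ updates shortens the average iteration time.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_iteration(n, m, k, rate_avail=1.0, rate_update=1.0):
    """One PS iteration under the timely scheme (illustrative sketch).

    Assumptions (not from the paper): client availability times and
    local update durations are i.i.d. exponential.
    """
    # The PS waits until m of the n clients become available.
    avail_times = rng.exponential(1.0 / rate_avail, size=n)
    wait_for_m = np.sort(avail_times)[m - 1]  # time until the m-th client arrives

    # The m selected clients compute local updates; the PS uses the earliest k.
    update_times = rng.exponential(1.0 / rate_update, size=m)
    finish_times = wait_for_m + np.sort(update_times)
    return finish_times[k - 1]  # the k-th earliest completion ends the round

def mean_iteration_time(n, m, k, iters=10_000):
    return np.mean([simulate_iteration(n, m, k) for _ in range(iters)])

# Example: with n = 100 clients, waiting for all m = 20 updates is dominated
# by the slowest straggler, while keeping the earliest k = 10 finishes sooner.
print(mean_iteration_time(n=100, m=20, k=20))
print(mean_iteration_time(n=100, m=20, k=10))
```

Under these assumptions, choosing $k < m$ truncates the straggler tail of each round, which is the intuition behind the smaller average iteration times reported in the abstract; the age-optimal $m$ and $k$ for a given $n$ are characterized numerically in the paper itself.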
