
Communication Efficient Federated Learning via Ordered ADMM in a Fully Decentralized Setting
arXiv: 2202.02580 · 5 February 2022
Yicheng Chen, Rick S. Blum, Brian M. Sadler
FedML

Papers citing "Communication Efficient Federated Learning via Ordered ADMM in a Fully Decentralized Setting"

4 / 4 papers shown
AdapterDistillation: Non-Destructive Task Composition with Knowledge Distillation
Junjie Wang, Yicheng Chen, Wangshu Zhang, Sen Hu, Teng Xu, Jing Zheng
VLM · 26 Dec 2023
Communication-Efficient Heterogeneous Federated Learning with Generalized Heavy-Ball Momentum
Riccardo Zaccone, Carlo Masone, Marco Ciccone
FedML · 30 Nov 2023
Personalized Graph Federated Learning with Differential Privacy
François Gauthier, Vinay Chakravarthi Gogineni, Stefan Werner, Yih-Fang Huang, A. Kuh
FedML · 10 Jun 2023
Communication-Efficient Federated Learning Using Censored Heavy Ball Descent
Yicheng Chen, Rick S. Blum, Brian M. Sadler
FedML · 24 Sep 2022