FedMAX: Mitigating Activation Divergence for Accurate and Communication-Efficient Federated Learning
Wei Chen, Kartikeya Bhardwaj, R. Marculescu
FedML · 7 April 2020

Papers citing "FedMAX: Mitigating Activation Divergence for Accurate and Communication-Efficient Federated Learning"

4 papers shown
Few-Shot Class-Incremental Learning with Non-IID Decentralized Data
Cuiwei Liu, Siang Xu, Huaijun Qiu, Jing Zhang, Zhi Liu, Liang Zhao
CLL · 18 Sep 2024
Deep Class Incremental Learning from Decentralized Data
Xiaohan Zhang, Songlin Dong, Jinjie Chen, Qiaoling Tian, Yihong Gong, Xiaopeng Hong
CLL · 11 Mar 2022
Accelerating Federated Learning with a Global Biased Optimiser
Jed Mills, Jia Hu, Geyong Min, Rui Jin, Siwei Zheng, Jin Wang
FedML · AI4CE · 20 Aug 2021
Communication Optimization in Large Scale Federated Learning using Autoencoder Compressed Weight Updates
Srikanth Chandar, Pravin Chandran, Raghavendra Bhat, Avinash Chakravarthi
AI4CE · 12 Aug 2021