ResearchTrend.AI


arXiv:2402.03815
Expediting In-Network Federated Learning by Voting-Based Consensus Model Compression

6 February 2024
Xiaoxin Su, Yipeng Zhou, Laizhong Cui, Song Guo
FedML

Papers citing "Expediting In-Network Federated Learning by Voting-Based Consensus Model Compression"

4 / 4 papers shown
  1. Transformer in Transformer
     Kai Han, An Xiao, Enhua Wu, Jianyuan Guo, Chunjing Xu, Yunhe Wang (ViT)
     284 · 1,524 · 0 · 27 Feb 2021

  2. An Efficient Statistical-based Gradient Compression Technique for Distributed Training Systems
     A. Abdelmoniem, Ahmed Elzanaty, Mohamed-Slim Alouini, Marco Canini
     51 · 74 · 0 · 26 Jan 2021

  3. Ternary Compression for Communication-Efficient Federated Learning
     Jinjin Xu, W. Du, Ran Cheng, Wangli He, Yaochu Jin (MQ, FedML)
     39 · 174 · 0 · 07 Mar 2020

  4. Adaptive Federated Learning in Resource Constrained Edge Computing Systems
     Shiqiang Wang, Tiffany Tuor, Theodoros Salonidis, K. Leung, C. Makaya, T. He, Kevin S. Chan
     144 · 1,687 · 0 · 14 Apr 2018