Fed-CVLC: Compressing Federated Learning Communications with Variable-Length Codes

6 February 2024 · arXiv: 2402.03770
Xiaoxin Su, Yipeng Zhou, Laizhong Cui, John C. S. Lui, Jiangchuan Liu
FedML

Papers citing "Fed-CVLC: Compressing Federated Learning Communications with Variable-Length Codes"

2 / 2 papers shown
An Efficient Statistical-based Gradient Compression Technique for Distributed Training Systems
A. Abdelmoniem, Ahmed Elzanaty, Mohamed-Slim Alouini, Marco Canini
26 Jan 2021
Adaptive Federated Learning in Resource Constrained Edge Computing Systems
Shiqiang Wang, Tiffany Tuor, Theodoros Salonidis, K. Leung, C. Makaya, T. He, Kevin S. Chan
14 Apr 2018