CosSGD: Communication-Efficient Federated Learning with a Simple Cosine-Based Quantization

15 December 2020
Yang He, Hui-Po Wang, M. Zenk, Mario Fritz
arXiv: 2012.08241
Tags: FedML, MQ
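This listing carries no abstract, but the FedML (federated learning) and MQ (model quantization) tags indicate the general setting: clients compress their gradient updates before sending them to a server. As a loudly hypothetical illustration of that compress-transmit-dequantize pattern only (not the cosine-based scheme CosSGD itself, whose details are not given on this page), here is a minimal sketch of a generic k-bit uniform gradient quantizer; the function names and the 4-bit default are invented for this example.

```python
# Illustrative sketch only: a generic k-bit uniform gradient quantizer of the
# kind used in communication-efficient federated learning. This is NOT the
# cosine-based quantization that CosSGD proposes; it only demonstrates the
# compress -> transmit -> dequantize pattern such methods share.
import numpy as np

def quantize(grad: np.ndarray, bits: int = 4):
    """Map each gradient entry to one of 2**bits levels over [min, max]."""
    levels = 2 ** bits - 1
    lo, hi = float(grad.min()), float(grad.max())
    scale = (hi - lo) / levels if hi > lo else 1.0
    codes = np.round((grad - lo) / scale).astype(np.uint8)  # k-bit codes
    # A client would transmit the codes plus two floats (lo, scale).
    return codes, lo, scale

def dequantize(codes: np.ndarray, lo: float, scale: float) -> np.ndarray:
    """Server-side reconstruction of the (approximate) gradient."""
    return codes.astype(np.float32) * scale + lo

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    g = rng.normal(size=10_000).astype(np.float32)
    codes, lo, scale = quantize(g, bits=4)
    g_hat = dequantize(codes, lo, scale)
    # 4-bit codes vs 32-bit floats: roughly an 8x smaller payload,
    # at the cost of a bounded per-entry reconstruction error.
    print("max abs error:", np.abs(g - g_hat).max())
```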

Papers citing "CosSGD: Communication-Efficient Federated Learning with a Simple Cosine-Based Quantization"

3 papers shown:
Language Models as Zero-shot Lossless Gradient Compressors: Towards General Neural Parameter Prior Models
Hui-Po Wang, Mario Fritz
26 Sep 2024

Towards Federated Learning with On-device Training and Communication in 8-bit Floating Point
Bokun Wang, Axel Berg, D. A. E. Acar, Chuteng Zhou
Tags: FedML, MQ
02 Jul 2024

ProgFed: Effective, Communication, and Computation Efficient Federated Learning by Progressive Training
Hui-Po Wang, Sebastian U. Stich, Yang He, Mario Fritz
Tags: FedML, AI4CE
11 Oct 2021