Compressed Communication for Distributed Training: Adaptive Methods and System

17 May 2021
Yuchen Zhong, Cong Xie, Shuai Zheng, Haibin Lin

Papers citing "Compressed Communication for Distributed Training: Adaptive Methods and System"

3 papers shown

Rethinking gradient sparsification as total error minimization
Atal Narayan Sahu, Aritra Dutta, A. Abdelmoniem, Trambak Banerjee, Marco Canini, Panos Kalnis
02 Aug 2021

An Efficient Statistical-based Gradient Compression Technique for Distributed Training Systems
A. Abdelmoniem, Ahmed Elzanaty, Mohamed-Slim Alouini, Marco Canini
26 Jan 2021

Scaling Laws for Neural Language Models
Jared Kaplan, Sam McCandlish, T. Henighan, Tom B. Brown, B. Chess, R. Child, Scott Gray, Alec Radford, Jeff Wu, Dario Amodei
23 Jan 2020