GraVAC: Adaptive Compression for Communication-Efficient Distributed DL Training

20 May 2023
S. Tyagi, Martin Swany

Papers citing "GraVAC: Adaptive Compression for Communication-Efficient Distributed DL Training" (6 of 6 papers shown)

1. OmniLearn: A Framework for Distributed Deep Learning over Heterogeneous Clusters
   S. Tyagi, Prateek Sharma (21 Mar 2025)

2. Lossless and Near-Lossless Compression for Foundation Models
   Moshik Hershcovitch, Leshem Choshen, Andrew Wood, Ilias Enmouri, Peter Chin, S. Sundararaman, Danny Harnik (05 Apr 2024)

3. Flexible Communication for Optimal Distributed Learning over Unpredictable Networks
   S. Tyagi, Martin Swany (05 Dec 2023)

4. Accelerating Distributed ML Training via Selective Synchronization
   S. Tyagi, Martin Swany [FedML] (16 Jul 2023)

5. Scavenger: A Cloud Service for Optimizing Cost and Performance of ML Training
   S. Tyagi, Prateek Sharma (12 Mar 2023)

6. An Efficient Statistical-based Gradient Compression Technique for Distributed Training Systems
   A. Abdelmoniem, Ahmed Elzanaty, Mohamed-Slim Alouini, Marco Canini (26 Jan 2021)