MixML: A Unified Analysis of Weakly Consistent Parallel Learning
14 May 2020 · arXiv:2005.06706
Yucheng Lu, J. Nash, Christopher De Sa

Papers citing "MixML: A Unified Analysis of Weakly Consistent Parallel Learning" (2 papers)
Maximizing Communication Efficiency for Large-scale Training via 0/1 Adam
Yucheng Lu, Conglong Li, Minjia Zhang, Christopher De Sa, Yuxiong He
12 Feb 2022

Optimal Complexity in Decentralized Training
Yucheng Lu, Christopher De Sa
15 Jun 2020