
arXiv: 1711.10123
Homomorphic Parameter Compression for Distributed Deep Learning Training

28 November 2017
Jaehee Jang, Byunggook Na, Sungroh Yoon
FedML

Papers citing "Homomorphic Parameter Compression for Distributed Deep Learning Training"

Distributed Training of Deep Neural Networks: Theoretical and Practical Limits of Parallel Scalability
J. Keuper, Franz-Josef Pfreundt
GNN
22 Sep 2016