Communication-Efficient Sampling for Distributed Training of Graph Convolutional Networks

19 January 2021
Peng Jiang
Masuma Akter Rumi
GNN

Papers citing "Communication-Efficient Sampling for Distributed Training of Graph Convolutional Networks"

2 / 2 papers shown
Solving Stochastic Compositional Optimization is Nearly as Easy as Solving Stochastic Optimization
Tianyi Chen
Yuejiao Sun
W. Yin
25 Aug 2020
Deep Graph Library: A Graph-Centric, Highly-Performant Package for Graph Neural Networks
Minjie Wang
Da Zheng
Zihao Ye
Quan Gan
Mufei Li
...
J. Zhao
Haotong Zhang
Alex Smola
Jinyang Li
Zheng-Wei Zhang
AI4CE
GNN
03 Sep 2019