ResearchTrend.AI
Minimal Variance Sampling with Provable Guarantees for Fast Training of Graph Neural Networks

24 June 2020
Weilin Cong, R. Forsati, M. Kandemir, M. Mahdavi

Papers citing "Minimal Variance Sampling with Provable Guarantees for Fast Training of Graph Neural Networks"

4 / 4 papers shown
LazyGNN: Large-Scale Graph Neural Networks via Lazy Propagation
Rui Xue, Haoyu Han, MohamadAli Torkamani, Jian Pei, Xiaorui Liu
03 Feb 2023
BNS-GCN: Efficient Full-Graph Training of Graph Convolutional Networks with Partition-Parallelism and Random Boundary Node Sampling
Cheng Wan, Youjie Li, Ang Li, Namjae Kim, Yingyan Lin
21 Mar 2022
On Provable Benefits of Depth in Training Graph Convolutional Networks
Weilin Cong, M. Ramezani, M. Mahdavi
28 Oct 2021
Sampling methods for efficient training of graph convolutional networks: A survey
Xin Liu, Mingyu Yan, Lei Deng, Guoqi Li, Xiaochun Ye, Dongrui Fan
10 Mar 2021