Counting Substructures with Higher-Order Graph Neural Networks: Possibility and Impossibility Results
arXiv: 2012.03174 (v2, latest) · 6 December 2020
B. Tahmasebi, Derek Lim, Stefanie Jegelka

Papers citing "Counting Substructures with Higher-Order Graph Neural Networks: Possibility and Impossibility Results"

14 citing papers listed.
  1. Balancing Efficiency and Expressiveness: Subgraph GNNs with Walk-Based Centrality. Joshua Southern, Yam Eitan, Guy Bar-Shalom, Michael M. Bronstein, Haggai Maron, Fabrizio Frasca. 06 Jan 2025.
  2. Expressivity of Graph Neural Networks Through the Lens of Adversarial Robustness. Francesco Campi, Lukas Gosch, Thomas Wollschläger, Yan Scholten, Stephan Günnemann. 16 Aug 2023.
  3. Weisfeiler and Leman Go Measurement Modeling: Probing the Validity of the WL Test. Arjun Subramonian, Adina Williams, Maximilian Nickel, Yizhou Sun, Levent Sagun. 11 Jul 2023.
  4. An Efficient Subgraph GNN with Provable Substructure Counting Power (KDD, 2023). Zuoyu Yan, Junru Zhou, Liangcai Gao, Zhi Tang, Muhan Zhang. 19 Mar 2023.
  5. Equivariant Polynomials for Graph Neural Networks (ICML, 2023). Omri Puny, Derek Lim, B. Kiani, Haggai Maron, Y. Lipman. 22 Feb 2023.
  6. Boosting the Cycle Counting Power of Graph Neural Networks with I$^2$-GNNs (ICLR, 2022). Yinan Huang, Xingang Peng, Jianzhu Ma, Muhan Zhang. 22 Oct 2022.
  7. Representation Power of Graph Neural Networks: Improved Expressivity via Algebraic Analysis. Charilaos I. Kanatsoulis, Alejandro Ribeiro. 19 May 2022.
  8. Theory of Graph Neural Networks: Representation and Learning. Stefanie Jegelka. 16 Apr 2022.
  9. A Survey on Machine Learning Solutions for Graph Pattern Extraction. Kai Siong Yow, Ningyi Liao, Siqiang Luo, Reynold Cheng, Chenhao Ma, Xiaolin Han. 03 Apr 2022.
  10. Sign and Basis Invariant Networks for Spectral Graph Representation Learning (ICLR, 2022). Derek Lim, Joshua Robinson, Lingxiao Zhao, Tess E. Smidt, S. Sra, Haggai Maron, Stefanie Jegelka. 25 Feb 2022.
  11. From Stars to Subgraphs: Uplifting Any GNN with Local Structure Awareness. Lingxiao Zhao, Wei Jin, Leman Akoglu, Neil Shah. 07 Oct 2021.
  12. Equivariant Subgraph Aggregation Networks. Beatrice Bevilacqua, Fabrizio Frasca, Derek Lim, Ninad Kulkarni, Chen Cai, G. Balamurugan, M. Bronstein, Haggai Maron. 06 Oct 2021.
  13. Graph Neural Networks with Local Graph Parameters (NeurIPS, 2021). Pablo Barceló, Floris Geerts, Juan L. Reutter, Maksimilian Ryschkov. 12 Jun 2021.
  14. Scalars are universal: Equivariant machine learning, structured like classical physics (NeurIPS, 2021). Soledad Villar, D. Hogg, Kate Storey-Fisher, Weichi Yao, Ben Blum-Smith. 11 Jun 2021.