Fast Parallel Graphlet Counting for Large Networks

13 June 2015
Nesreen Ahmed
Jennifer Neville
Ryan A. Rossi
N. Duffield
arXiv: 1506.04322
Abstract

From social science to biology, numerous applications rely on motifs for an intuitive and meaningful characterization of networks at both the global macro-level and the local micro-level. While motifs have had tremendous success and impact in a variety of domains, there has yet to be a fast and efficient approach for computing the frequencies of these subgraph patterns: existing methods do not scale to large networks with millions of nodes and edges, which impedes the application of motifs to new problems that require large-scale network analysis. To address these problems, we propose a fast, efficient, and parallel algorithm for counting motifs of size k = {3, 4} nodes that takes only a fraction of the time required by current methods. The proposed motif counting algorithms leverage a number of proven combinatorial arguments for different motifs. For each edge, we count a few motifs directly, and from these counts together with the combinatorial arguments, we obtain the exact counts of the others in constant time. On a large collection of 300+ networks from a variety of domains, our motif counting strategies are on average 460x faster than current methods. This opens new opportunities to investigate the use of motifs on much larger networks and in newer applications, as we show in our experiments. To the best of our knowledge, this paper provides the largest motif computations to date as well as the largest systematic investigation, covering over 300 networks from a variety of domains.
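The core idea described in the abstract, counting a few motifs directly per edge and deriving the rest from combinatorial identities, is easiest to see for the 3-node case. The sketch below is a minimal, sequential Python illustration based on standard identities, not the authors' parallel implementation; the function name and the example graph are invented for this note. For each edge it counts incident triangles directly, derives the incident open 2-paths from the endpoint degrees in constant time, and then corrects for the fact that each triangle is shared by three edges and each open path by two. The paper's parallel algorithm distributes this kind of independent per-edge work across workers and applies analogous (more involved) identities for the 4-node motifs.

```python
from collections import defaultdict

def count_3node_graphlets(edges):
    """Count connected 3-node motifs (triangles and open 2-paths) from
    per-edge quantities. Assumes a simple undirected graph given as a list
    of edges with no self-loops or duplicates. Illustrative sketch only."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    tri_sum = 0   # each triangle is seen once per edge, i.e. 3 times in total
    path_sum = 0  # each open 2-path (wedge) is seen once per edge, i.e. twice

    for u, v in edges:
        # Triangles containing edge (u, v): common neighbours of u and v.
        tri_e = len(adj[u] & adj[v])
        # Open 2-paths containing (u, v): neighbours of exactly one endpoint,
        # obtained from the degrees and tri_e in constant time.
        path_e = (len(adj[u]) - 1 - tri_e) + (len(adj[v]) - 1 - tri_e)
        tri_sum += tri_e
        path_sum += path_e

    return {"triangles": tri_sum // 3, "open_2_paths": path_sum // 2}

if __name__ == "__main__":
    # Small example: a triangle {0, 1, 2} with a pendant node 3 attached to 2.
    edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
    print(count_3node_graphlets(edges))  # {'triangles': 1, 'open_2_paths': 2}
```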
