
On the Sheafification of Higher-Order Message Passing

Main: 42 Pages
24 Figures
Bibliography: 3 Pages
2 Tables
Abstract

Recent work in Topological Deep Learning (TDL) seeks to generalize graph learning's preeminent message passing paradigm to more complex relational structures: simplicial complexes, cell complexes, hypergraphs, and combinations thereof. Many approaches to such higher-order message passing (HOMP) admit formulation in terms of nonlinear diffusion with the Hodge (combinatorial) Laplacian, a graded operator which carries an inductive bias that dimension-k data features correlate with dimension-k topological features encoded in the (singular) cohomology of the underlying domain. For k=0 this recovers the graph Laplacian and its well-studied homophily bias. In higher gradings, however, the Hodge Laplacian's bias is more opaque and potentially even degenerate. In this essay, we position sheaf theory as a natural and principled formalism for modifying the Hodge Laplacian's diffusion-mediated interface between local and global descriptors toward more expressive message passing. The sheaf Laplacian's inductive bias correlates dimension-k data features with dimension-k sheaf cohomology, a data-aware generalization of singular cohomology. We contextualize and novelly extend prior theory on sheaf diffusion in graph learning (k=0) in this light -- and explore how it fails to generalize to k>0 -- before developing novel theory and practice for the higher-order setting. Our exposition is accompanied by a self-contained introduction shepherding sheaves from the abstract to the applied.
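To make the k=0 claim above concrete, the following is a minimal sketch (not the paper's implementation) of a cellular sheaf Laplacian on a graph: each vertex carries a d-dimensional stalk, each edge e = (u, v) carries a pair of restriction maps, and the Laplacian is L_F = δᵀδ for the sheaf coboundary δ. The function name `sheaf_laplacian` and its signature are illustrative choices. When every restriction map is the identity and d = 1, the construction recovers the ordinary graph Laplacian, matching the homophily-bias special case mentioned in the abstract.

```python
import numpy as np

def sheaf_laplacian(n_vertices, edges, restrictions, d):
    """Build the sheaf Laplacian L_F = delta^T @ delta for a cellular sheaf
    on a graph. Each vertex stalk is R^d; restrictions[i] is the pair
    (F_u, F_v) of d x d restriction maps for edge i = (u, v)."""
    m = len(edges)
    # Coboundary delta maps 0-cochains (vertex data) to 1-cochains (edge data):
    # (delta x)_e = F_v x_v - F_u x_u, measuring disagreement across each edge.
    delta = np.zeros((m * d, n_vertices * d))
    for i, (u, v) in enumerate(edges):
        Fu, Fv = restrictions[i]
        delta[i * d:(i + 1) * d, u * d:(u + 1) * d] = -Fu
        delta[i * d:(i + 1) * d, v * d:(v + 1) * d] = Fv
    return delta.T @ delta

# Sanity check: identity restriction maps on 1-dimensional stalks over a
# triangle graph recover the graph Laplacian D - A.
edges = [(0, 1), (1, 2), (0, 2)]
I = np.eye(1)
L = sheaf_laplacian(3, edges, [(I, I)] * 3, d=1)
print(L)  # [[ 2. -1. -1.] [-1.  2. -1.] [-1. -1.  2.]]
```

Diffusion with L_F then smooths vertex features only up to the sheaf's restriction maps, which is the sense in which the sheaf Laplacian's bias is data-aware rather than purely homophilic.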
