HollowFlow: Efficient Sample Likelihood Evaluation using Hollow Message Passing

Comments: 9 pages (main text), 5 figures, 14 tables; 5-page bibliography; 11-page appendix
Abstract

Flow- and diffusion-based models have emerged as powerful tools for scientific applications, particularly for sampling non-normalized probability distributions, as exemplified by Boltzmann Generators (BGs). A critical challenge in deploying these models is their reliance on sample likelihood computations, which scale prohibitively with system size $n$, often rendering them infeasible for large-scale problems. To address this, we introduce \textit{HollowFlow}, a flow-based generative model leveraging a novel non-backtracking graph neural network (NoBGNN). By enforcing a block-diagonal Jacobian structure, HollowFlow likelihoods are evaluated with a constant number of backward passes in $n$, yielding speed-ups of up to $\mathcal{O}(n^2)$: a significant step towards scaling BGs to larger systems. Crucially, our framework generalizes: \textbf{any equivariant GNN or attention-based architecture} can be adapted into a NoBGNN. We validate HollowFlow by training BGs on two different systems of increasing size. For both systems, the sampling and likelihood evaluation time decreases dramatically, following our theoretical scaling laws. For the larger system we obtain a $10^2\times$ speed-up, clearly illustrating the potential of HollowFlow-based approaches for high-dimensional scientific problems previously hindered by computational bottlenecks.
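To illustrate why a block-diagonal Jacobian makes likelihood evaluation cheap, here is a minimal numerical sketch (not the paper's implementation; the toy linear flow, block size, and finite-difference probing are illustrative assumptions). A dense $n \times n$ Jacobian needs $n$ passes to assemble, but when the Jacobian is block-diagonal with blocks of size $b$, only $b$ probing passes suffice, independent of $n$, because one probe vector can touch the same column of every block simultaneously:

```python
import numpy as np

rng = np.random.default_rng(0)
n, b = 12, 3  # toy system of dimension 12, Jacobian blocks of size 3
blocks = [rng.normal(size=(b, b)) for _ in range(n // b)]

def f(x):
    # toy "flow" whose Jacobian is block-diagonal by construction:
    # each size-b chunk of x is transformed independently
    return np.concatenate([A @ x[i*b:(i+1)*b] for i, A in enumerate(blocks)])

x = rng.normal(size=n)
eps = 1e-6

# naive route: n finite-difference passes build the full n x n Jacobian
J = np.stack([(f(x + eps*e) - f(x - eps*e)) / (2*eps) for e in np.eye(n)],
             axis=1)
logdet_full = np.linalg.slogdet(J)[1]

# block-diagonal route: b passes, independent of n. The probe vector
# selects column j of *every* block at once, since cross-block Jacobian
# entries are zero.
cols = []
for j in range(b):
    v = np.tile(np.eye(b)[j], n // b)
    cols.append((f(x + eps*v) - f(x - eps*v)) / (2*eps))

# reassemble each b x b diagonal block and sum the block log-dets
logdet_fast = 0.0
for i in range(n // b):
    B = np.stack([c[i*b:(i+1)*b] for c in cols], axis=1)
    logdet_fast += np.linalg.slogdet(B)[1]

print(np.isclose(logdet_full, logdet_fast, atol=1e-3))
```

The same accounting applies with reverse-mode autodiff in place of finite differences: the number of backward passes drops from $\mathcal{O}(n)$ to a constant set by the block size, which is the source of the speed-up the abstract describes.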
