Iterative Methods via Locally Evolving Set Process

Neural Information Processing Systems (NeurIPS), 2024
Main: 9 pages · Appendix: 45 pages · Bibliography: 4 pages · 19 figures · 7 tables
Abstract

Given the damping factor $\alpha$ and precision tolerance $\epsilon$, \citet{andersen2006local} introduced Approximate Personalized PageRank (APPR), the \textit{de facto local method} for approximating the PPR vector, with runtime bounded by $\Theta(1/(\alpha\epsilon))$, independent of the graph size. Recently, \citet{fountoulakis2022open} asked whether faster local algorithms could be developed using $\tilde{O}(1/(\sqrt{\alpha}\epsilon))$ operations. By noticing that APPR is a local variant of Gauss-Seidel, this paper explores the question of \textit{whether standard iterative solvers can be effectively localized}. We propose the \textit{locally evolving set process}, a novel framework for characterizing algorithm locality, and demonstrate that many standard solvers can be effectively localized. Let $\overline{\operatorname{vol}}(S_t)$ and $\overline{\gamma}_t$ be the running averages of the volume and the residual ratio of the active nodes $S_t$ during the process. We show $\overline{\operatorname{vol}}(S_t)/\overline{\gamma}_t \leq 1/\epsilon$ and prove that APPR admits a new runtime bound $\tilde{O}(\overline{\operatorname{vol}}(S_t)/(\alpha\overline{\gamma}_t))$ mirroring its actual performance. Furthermore, when the geometric mean of the residual reduction is $\Theta(\sqrt{\alpha})$, there exists $c \in (0,2)$ such that the local Chebyshev method has runtime $\tilde{O}(\overline{\operatorname{vol}}(S_t)/(\sqrt{\alpha}(2-c)))$ without the monotonicity assumption. Numerical results confirm the efficiency of this novel framework and show up to a hundredfold speedup over the corresponding standard solvers on real-world graphs.
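For context on the local method the abstract builds on, here is a minimal Python sketch of the APPR local-push rule of \citet{andersen2006local}: only nodes whose residual exceeds $\epsilon \cdot \deg(u)$ are ever touched, which is what makes the runtime independent of graph size. This is an illustrative non-lazy push variant for an unweighted undirected graph stored as an adjacency dict (the function name and data layout are our own, not the paper's), not the authors' implementation.

```python
from collections import deque

def appr_push(adj, seed, alpha=0.15, eps=1e-4):
    """Sketch of APPR: approximate the personalized PageRank vector for
    `seed` by pushing residual mass locally until every node's residual
    drops below eps * degree.  Each push converts an alpha-fraction of the
    residual at u into estimate mass, so total mass p + r stays 1."""
    p = {}                      # PPR estimate, nonzero only on touched nodes
    r = {seed: 1.0}             # residual mass, starts entirely on the seed
    queue = deque([seed])       # nodes whose residual may exceed the threshold
    in_queue = {seed}
    while queue:
        u = queue.popleft()
        in_queue.discard(u)
        deg_u = len(adj[u])
        if r.get(u, 0.0) < eps * deg_u:
            continue            # residual fell below threshold; skip
        ru = r.pop(u)
        p[u] = p.get(u, 0.0) + alpha * ru          # keep alpha-fraction
        share = (1.0 - alpha) * ru / deg_u         # spread rest to neighbors
        for v in adj[u]:
            r[v] = r.get(v, 0.0) + share
            if r[v] >= eps * len(adj[v]) and v not in in_queue:
                queue.append(v)
                in_queue.add(v)
    return p, r
```

Each push removes at least $\alpha\epsilon\deg(u)$ of residual mass, so the total number of pushes (weighted by degree) is at most $1/(\alpha\epsilon)$, the classical $\Theta(1/(\alpha\epsilon))$ bound quoted above.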
