Fast Online Node Labeling for Very Large Graphs

This paper studies the online node classification problem under a transductive learning setting. Current methods either invert a graph kernel matrix, with cubic runtime and quadratic space complexity, or sample a large volume of random spanning trees, and are thus difficult to scale to large graphs. In this work, we propose an improvement based on the online relaxation technique introduced by a series of works (Rakhlin et al., 2012; Rakhlin and Sridharan, 2015; 2017). We first prove an effective regret bound when suitably parameterized graph kernels are chosen, then propose an approximate algorithm, FastONL, that enjoys a comparable regret bound based on this relaxation. The key component of FastONL is a generalized local push method that efficiently approximates columns of inverse kernel matrices and applies to a family of popular graph kernels. Furthermore, the per-prediction cost depends only on a local neighborhood of the graph, while the total memory cost is linear in the graph size. Experiments show that our scalable method enjoys a better tradeoff between local and global consistency.
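The "local push" idea referenced above is closely related to the classic approximate personalized PageRank push procedure, which approximates a single column of a matrix inverse of the form $(1-\alpha)(I - \alpha A D^{-1})^{-1}$ while touching only nodes near the source. The sketch below is illustrative only, not the paper's FastONL algorithm: all names (`local_push`, `exact_ppr`), the graph, and the parameters `alpha`/`eps` are assumptions chosen for the demo, and the per-node stopping rule `r[u] < eps * deg[u]` follows the standard push heuristic.

```python
from collections import deque

def local_push(neighbors, s, alpha=0.15, eps=1e-10):
    """Approximate the s-th column of (1-alpha) * (I - alpha * A * D^{-1})^{-1},
    i.e. a personalized PageRank vector, by pushing residual mass locally.
    neighbors: dict mapping node -> list of neighbors (undirected graph)."""
    deg = {u: len(vs) for u, vs in neighbors.items()}
    p = {u: 0.0 for u in neighbors}   # current estimate
    r = {u: 0.0 for u in neighbors}   # residual (unprocessed) mass
    r[s] = 1.0
    active, in_queue = deque([s]), {s}
    while active:
        u = active.popleft()
        in_queue.discard(u)
        ru = r[u]
        if ru < eps * deg[u]:
            continue
        # Keep a (1 - alpha) fraction at u, spread an alpha fraction to neighbors.
        p[u] += (1 - alpha) * ru
        r[u] = 0.0
        share = alpha * ru / deg[u]
        for v in neighbors[u]:
            r[v] += share
            if r[v] >= eps * deg[v] and v not in in_queue:
                active.append(v)
                in_queue.add(v)
    return p, r

def exact_ppr(neighbors, s, alpha=0.15, iters=3000):
    """Reference solution via fixed-point iteration x = (1-alpha)e_s + alpha*A*D^{-1}*x
    (small graphs only), used to check the push approximation."""
    deg = {u: len(vs) for u, vs in neighbors.items()}
    x = {u: 0.0 for u in neighbors}
    x[s] = 1.0
    for _ in range(iters):
        nxt = {u: (1 - alpha) * (1.0 if u == s else 0.0) for u in neighbors}
        for u in neighbors:
            for v in neighbors[u]:
                nxt[v] += alpha * x[u] / deg[u]
        x = nxt
    return x

# Toy undirected graph (assumed for illustration).
G = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
p, r = local_push(G, s=0)
# Invariant of the push step: estimate mass plus residual mass always sums to 1.
print(abs(sum(p.values()) + sum(r.values()) - 1.0) < 1e-9)
```

Each push moves mass from the residual vector into the estimate without ever materializing the full kernel matrix, which is why the cost depends only on the nodes actually touched; FastONL generalizes this style of update to a broader family of kernels.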