
Tuning Algorithmic and Architectural Hyperparameters in Graph-Based Semi-Supervised Learning with Provable Guarantees

Abstract

Graph-based semi-supervised learning is a powerful paradigm in machine learning for modeling and exploiting the underlying graph structure that captures the relationship between labeled and unlabeled data. A large number of classical as well as modern deep learning-based algorithms have been proposed for this problem, often having tunable hyperparameters. We initiate a formal study of tuning algorithm hyperparameters from parameterized algorithm families for this problem. We obtain novel $O(\log n)$ pseudo-dimension upper bounds for hyperparameter selection in three classical label propagation-based algorithm families, where $n$ is the number of nodes, implying bounds on the amount of data needed for learning provably good parameters. We further provide matching $\Omega(\log n)$ pseudo-dimension lower bounds, thus asymptotically characterizing the learning-theoretic complexity of the parameter tuning problem. We extend our study to selecting architectural hyperparameters in modern graph neural networks. We bound the Rademacher complexity for tuning the self-loop weighting in recently proposed Simplified Graph Convolution (SGC) networks. We further propose a tunable architecture that interpolates between graph convolutional networks (GCN) and graph attention networks (GAT) in every layer, and provide Rademacher complexity bounds for tuning the interpolation coefficient.
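To make the tuned objects concrete, below is a minimal NumPy sketch (an illustration of ours, not code from the paper) of two representative parameterized families of the kind the abstract refers to: the classical label spreading algorithm of Zhou et al. (2004), whose hyperparameter alpha trades off graph smoothing against fidelity to the observed labels, and SGC feature propagation with a tunable self-loop weight gamma. The exact parameterizations studied in the paper may differ.

```python
import numpy as np

def label_spreading(W, Y, alpha):
    """Label spreading (Zhou et al., 2004): one classical label
    propagation family with a single tunable hyperparameter alpha.

    W:     (n, n) symmetric nonnegative similarity matrix
    Y:     (n, c) one-hot rows for labeled nodes, zero rows otherwise
    alpha: in (0, 1); larger values trust the graph more, smaller
           values trust the observed labels more
    """
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    S = D_inv_sqrt @ W @ D_inv_sqrt       # symmetrically normalized W
    n = W.shape[0]
    # Closed form of the fixed point F = alpha * S @ F + (1 - alpha) * Y
    F = (1.0 - alpha) * np.linalg.solve(np.eye(n) - alpha * S, Y)
    return F.argmax(axis=1)               # predicted class per node

def sgc_propagate(A, X, gamma, K):
    """SGC feature propagation with a tunable self-loop weight gamma;
    gamma = 1 recovers the usual A + I self-loops of standard SGC
    (Wu et al., 2019).

    A:     (n, n) adjacency matrix
    X:     (n, d) node feature matrix
    gamma: self-loop weight, the architectural hyperparameter
    K:     number of propagation steps
    """
    n = A.shape[0]
    A_g = A + gamma * np.eye(n)
    d = A_g.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    S = D_inv_sqrt @ A_g @ D_inv_sqrt     # normalized, self-loop-weighted
    H = X
    for _ in range(K):
        H = S @ H                         # K-step linear smoothing
    return H  # SGC then fits a linear classifier on H
```

In this setting, "tuning" means choosing alpha (or gamma) from data across problem instances; the pseudo-dimension and Rademacher complexity bounds in the abstract quantify how many instances suffice to learn provably good values of such parameters.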

@article{du2025_2502.12937,
  title={Tuning Algorithmic and Architectural Hyperparameters in Graph-Based Semi-Supervised Learning with Provable Guarantees},
  author={Ally Yalei Du and Eric Huang and Dravyansh Sharma},
  journal={arXiv preprint arXiv:2502.12937},
  year={2025}
}