Consistency of Fractional Graph-Laplacian Regularization in Semi-Supervised Learning with Finite Labels

Abstract

Laplace learning is a popular machine learning algorithm for finding missing labels from a small number of labelled feature vectors using the geometry of a graph. More precisely, Laplace learning is based on minimising a graph-Dirichlet energy, equivalently a discrete Sobolev $W^{1,2}$ semi-norm, constrained to taking the values of known labels on a given subset. The variational problem is asymptotically ill-posed as the number of unlabelled feature vectors goes to infinity while the number of given labels stays finite, due to a lack of regularity in minimisers of the continuum Dirichlet energy in any dimension higher than one. In particular, continuum minimisers are not continuous. One solution is to consider higher-order regularisation, which is the analogue of minimising Sobolev $W^{s,2}$ semi-norms. In this paper we consider the asymptotics of minimising a graph variant of the Sobolev $W^{s,2}$ semi-norm with pointwise constraints. We show that, as expected, one needs $s > d/2$, where $d$ is the dimension of the data manifold. We also show that there must be an upper bound on the connectivity of the graph; that is, highly connected graphs lead to degenerate behaviour of the minimiser even when $s > d/2$.
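As an illustration of the setup described above, the following is a minimal numpy sketch of Laplace learning and its fractional variant on a k-nearest-neighbour graph. The graph construction, the function names, and the parameter choices (k, sigma, s) are illustrative assumptions for this sketch and are not taken from the paper; the paper's analysis concerns the asymptotic regime, not any particular implementation.

import numpy as np

def knn_graph_laplacian(X, k=10, sigma=1.0):
    # Unnormalised graph Laplacian L = D - W on a symmetrised k-NN
    # graph with Gaussian edge weights (one common construction;
    # the paper does not prescribe this particular choice).
    n = X.shape[0]
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Keep only each point's k strongest edges, then symmetrise.
    drop = np.argsort(W, axis=1)[:, : n - k]
    W[np.arange(n)[:, None], drop] = 0.0
    W = np.maximum(W, W.T)
    return np.diag(W.sum(axis=1)) - W

def fractional_laplace_learning(L, labelled, g, s=1.0):
    # Minimise <u, L^s u> subject to u = g on the labelled indices.
    # s = 1 recovers classical Laplace learning; larger s corresponds
    # to the higher-order regularisation discussed in the abstract.
    lam, V = np.linalg.eigh(L)
    lam = np.clip(lam, 0.0, None)     # guard against tiny negative eigenvalues
    A = (V * lam ** s) @ V.T          # A = L^s via the spectral decomposition
    n = L.shape[0]
    unlabelled = np.setdiff1d(np.arange(n), labelled)
    u = np.zeros(n)
    u[labelled] = g
    # First-order optimality of the constrained quadratic:
    # A_uu u_u = -A_ul g.
    u[unlabelled] = np.linalg.solve(
        A[np.ix_(unlabelled, unlabelled)],
        -A[np.ix_(unlabelled, labelled)] @ g,
    )
    return u

# Example: two labelled points in a two-dimensional point cloud.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
L = knn_graph_laplacian(X, k=8)
u = fractional_laplace_learning(L, labelled=np.array([0, 1]),
                                g=np.array([-1.0, 1.0]), s=2.0)

With s = 1 the solve reproduces classical Laplace learning; in the regime the paper studies, where the number of unlabelled points grows with the labels held fixed, it is the choice of s relative to d and the connectivity of the graph that determine whether the minimiser degenerates.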
