
Locality Regularized Reconstruction: Structured Sparsity and Delaunay Triangulations

Abstract

Linear representation learning is widely studied due to its conceptual simplicity and empirical utility in tasks such as compression, classification, and feature extraction. Given a set of points $[\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_n] = \mathbf{X} \in \mathbb{R}^{d \times n}$ and a vector $\mathbf{y} \in \mathbb{R}^d$, the goal is to find coefficients $\mathbf{w} \in \mathbb{R}^n$ so that $\mathbf{X}\mathbf{w} \approx \mathbf{y}$, subject to some desired structure on $\mathbf{w}$. In this work we seek $\mathbf{w}$ that forms a local reconstruction of $\mathbf{y}$ by solving a regularized least squares regression problem. We obtain local solutions through a locality function that, when used as a regularization term, promotes the use of columns of $\mathbf{X}$ that are close to $\mathbf{y}$. We prove that, for all levels of regularization and under the mild condition that the columns of $\mathbf{X}$ have a unique Delaunay triangulation, the number of non-zero entries in the optimal coefficients is at most $d+1$, thereby yielding sparse local solutions when $d \ll n$. Under the same condition we also show that, for any $\mathbf{y}$ contained in the convex hull of $\mathbf{X}$, there exists a regime of the regularization parameter in which the optimal coefficients are supported on the vertices of the Delaunay simplex containing $\mathbf{y}$. This interprets the sparsity as structure obtained implicitly from the Delaunay triangulation of $\mathbf{X}$. We demonstrate that our locality regularized problem can be solved in time comparable to that of other methods that identify the containing Delaunay simplex.
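
As a concrete illustration, below is a minimal sketch of one natural reading of the locality-regularized problem: minimizing $\|\mathbf{X}\mathbf{w} - \mathbf{y}\|_2^2 + \lambda \sum_j w_j \|\mathbf{x}_j - \mathbf{y}\|_2^2$ over the probability simplex. The exact objective, the simplex constraint, and the helper name `locality_regularized_weights` are assumptions inferred from the abstract (the convex-hull setting suggests nonnegative weights summing to one), not the paper's verbatim formulation.

```python
# A hedged sketch of locality-regularized reconstruction, ASSUMING the
# objective  min_{w in simplex}  ||X w - y||^2 + lam * sum_j w_j ||x_j - y||^2.
# This is one plausible formulation consistent with the abstract, not
# necessarily the paper's exact definition.
import numpy as np
from scipy.optimize import minimize

def locality_regularized_weights(X, y, lam=1.0):
    d, n = X.shape
    # Locality penalty coefficients: squared distance of each column to y.
    dists = np.sum((X - y[:, None]) ** 2, axis=0)

    def objective(w):
        r = X @ w - y
        return r @ r + lam * dists @ w

    def grad(w):
        return 2.0 * X.T @ (X @ w - y) + lam * dists

    w0 = np.full(n, 1.0 / n)
    # Simplex constraints (assumed): w >= 0 and sum(w) = 1.
    res = minimize(objective, w0, jac=grad, method="SLSQP",
                   bounds=[(0.0, None)] * n,
                   constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])
    return res.x

# Toy usage: for y inside the convex hull of X and a suitable lam, the
# abstract predicts a solution supported on at most d + 1 columns.
rng = np.random.default_rng(0)
X = rng.normal(size=(2, 50))
y = X[:, :3] @ np.array([0.5, 0.3, 0.2])  # a point in the convex hull
w = locality_regularized_weights(X, y, lam=0.1)
print("support size:", np.sum(w > 1e-6))
```

This generic quadratic-programming solver is only for exposition; the paper's claim is that the problem can be solved in time comparable to dedicated methods for locating the containing Delaunay simplex.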
