Surrogate to Poincaré inequalities on manifolds for dimension reduction in nonlinear feature spaces

Main: 39 pages, 7 figures; bibliography: 4 pages
Abstract

We aim to approximate a continuously differentiable function $u:\mathbb{R}^d \rightarrow \mathbb{R}$ by a composition of functions $f\circ g$, where $g:\mathbb{R}^d \rightarrow \mathbb{R}^m$ with $m\leq d$ and $f:\mathbb{R}^m \rightarrow \mathbb{R}$ are built in a two-stage procedure. For a fixed $g$, we build $f$ using classical regression methods, involving evaluations of $u$. Recent works proposed to build a nonlinear $g$ by minimizing a loss function $\mathcal{J}(g)$ derived from Poincaré inequalities on manifolds, involving evaluations of the gradient of $u$. A difficulty is that minimizing $\mathcal{J}$ may be a challenging task. In this work, we therefore introduce new convex surrogates to $\mathcal{J}$. Leveraging concentration inequalities, we provide sub-optimality results for a class of functions $g$, including polynomials, and for a wide class of input probability measures. We investigate performance on different benchmarks for various training sample sizes. We show that our approach outperforms standard iterative methods for minimizing the training Poincaré-inequality-based loss, often yielding better approximation errors, especially for rather small training sets and $m=1$.
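As a concrete, much-simplified illustration of the two-stage procedure described above, the sketch below builds a *linear* $g$ for $m=1$ from gradient samples, taking the dominant eigenvector of the empirical matrix $\frac{1}{n}\sum_i \nabla u(x_i)\nabla u(x_i)^\top$ as a gradient-based stand-in for the Poincaré-loss minimization (the paper's $g$ is nonlinear and its surrogate is convex; the function name `two_stage_surrogate` and this linear choice are illustrative assumptions, not the paper's method). The second stage fits $f$ by one-dimensional polynomial least squares on evaluations of $u$.

```python
import numpy as np

def two_stage_surrogate(X, u_vals, grads, degree=3):
    """Two-stage approximation u(x) ~ f(g(x)) with a linear g and m = 1.

    Stage 1: choose g(x) = w . x, with w the top eigenvector of the
             empirical matrix (1/n) sum_i grad u(x_i) grad u(x_i)^T
             (a gradient-based heuristic, standing in for minimizing a
             Poincare-type loss over nonlinear g).
    Stage 2: fit f by 1-D polynomial least-squares regression on the
             reduced samples (g(x_i), u(x_i)).
    """
    C = grads.T @ grads / len(grads)        # empirical E[grad u grad u^T]
    _, eigvecs = np.linalg.eigh(C)
    w = eigvecs[:, -1]                      # dominant direction -> g(x) = w . x
    z = X @ w                               # reduced coordinates g(x_i)
    coeffs = np.polyfit(z, u_vals, degree)  # stage 2: regression for f
    return lambda x: np.polyval(coeffs, x @ w)

# Example: u(x) = (a . x)^3 is a ridge function, so m = 1 is exact
# and a degree-3 polynomial f recovers u up to numerical error.
rng = np.random.default_rng(0)
d, n = 5, 200
a = rng.standard_normal(d)
X = rng.standard_normal((n, d))
u_vals = (X @ a) ** 3
grads = 3.0 * (X @ a)[:, None] ** 2 * a     # grad u(x) = 3 (a . x)^2 a
f_of_g = two_stage_surrogate(X, u_vals, grads, degree=3)
X_test = rng.standard_normal((100, d))
err = np.max(np.abs(f_of_g(X_test) - (X_test @ a) ** 3))
```

For a ridge function such as this, the dominant eigenvector aligns with $a/\|a\|$ (up to sign, which the regression absorbs), so the composition reproduces $u$ essentially exactly; for general $u$, the quality of the reduced model depends on how well the chosen $g$ captures the variation of $u$, which is precisely what the Poincaré-inequality-based loss quantifies.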
