LocalNysation: Combining localized kernel regression and Nyström subsampling

Abstract

We consider a localized approach in the well-established setting of reproducing kernel learning under random design. The input space X is partitioned into disjoint local subsets X_j (j = 1, ..., m), each equipped with a local reproducing kernel K_j. It is then straightforward to define local KRR estimates. Our first main contribution is to show that minimax optimal rates of convergence are preserved if the number m of partitions grows sufficiently slowly with the sample size, under locally different degrees of smoothness assumptions on the regression function. As a byproduct, we show that low smoothness on exceptional sets of small probability does not contribute, leading to a faster rate of convergence. Our second contribution lies in showing that the partitioning approach for KRR can be efficiently combined with local Nyström subsampling, improving computational cost in two respects. If the number of locally subsampled inputs grows sufficiently fast with the sample size, minimax optimal rates of convergence are maintained.
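The scheme described above can be sketched in a few lines: partition the input space, subsample Nyström landmarks within each cell, and solve one reduced ridge system per cell. This is a minimal illustration only, not the paper's implementation; the interval partition, the Gaussian kernel, and all function and parameter names below are assumptions.

```python
import numpy as np

def local_krr_nystrom(X, y, edges, n_landmarks=20, lam=1e-2, gamma=10.0):
    """Fit one Nystrom-approximated KRR estimate per partition cell.

    X, y   : 1-D training inputs/targets (illustrative 1-D setting)
    edges  : cell boundaries, e.g. [0.0, 0.5, 1.0] gives m = 2 cells
    Returns a predictor f(x). All names and parameters are illustrative.
    """
    rng = np.random.default_rng(0)
    # Gaussian kernel on the real line (an assumed choice of K_j)
    k = lambda A, B: np.exp(-gamma * (A[:, None] - B[None, :]) ** 2)
    models = []
    for a, b in zip(edges[:-1], edges[1:]):
        mask = (X >= a) & (X < b)
        Xj, yj = X[mask], y[mask]
        # Nystrom step: subsample landmark inputs within this cell only
        idx = rng.choice(len(Xj), size=min(n_landmarks, len(Xj)), replace=False)
        Z = Xj[idx]
        Knm = k(Xj, Z)   # n_j x m_j cross-kernel matrix
        Kmm = k(Z, Z)    # m_j x m_j landmark kernel matrix
        # Solve the Nystrom-reduced ridge system for this cell
        alpha = np.linalg.solve(Knm.T @ Knm + lam * len(Xj) * Kmm, Knm.T @ yj)
        models.append((a, b, Z, alpha))

    def predict(x):
        # Evaluate the local estimate of the cell containing x
        for a, b, Z, alpha in models:
            if a <= x < b:
                return np.exp(-gamma * (x - Z) ** 2) @ alpha
        return 0.0
    return predict

# Toy usage: noisy regression on [0, 1) with two cells
X = np.sort(np.random.default_rng(1).uniform(0, 1, 200))
y = np.sin(2 * np.pi * X) + 0.1 * np.random.default_rng(2).normal(size=200)
f = local_krr_nystrom(X, y, edges=[0.0, 0.5, 1.0])
```

Each cell only ever touches its own n_j points and m_j landmarks, which is where the twofold computational gain (partitioning plus subsampling) comes from in this sketch.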
