
Fast Kernel Summation in High Dimensions via Slicing and Fourier Transforms

Abstract

Kernel-based methods are heavily used in machine learning. However, they suffer from O(N^2) complexity in the number N of considered data points. In this paper, we propose an approximation procedure which reduces this complexity to O(N). Our approach is based on two ideas. First, we prove that any radial kernel with analytic basis function can be represented as a sliced version of some one-dimensional kernel, and we derive an analytic formula for the one-dimensional counterpart. It turns out that the relation between one- and d-dimensional kernels is given by a generalized Riemann-Liouville fractional integral. Hence, we can reduce the d-dimensional kernel summation to a one-dimensional setting. Second, to solve these one-dimensional problems efficiently, we apply fast Fourier summations on non-equispaced data, a sorting algorithm, or a combination of both. Due to its practical importance, we pay special attention to the Gaussian kernel, for which we show a dimension-independent error bound and represent its one-dimensional counterpart via a closed-form Fourier transform. We provide a run time comparison and error estimates for our fast kernel summations.
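As a minimal sketch of the slicing idea (not the authors' code), consider the negative-distance (energy) kernel, whose one-dimensional counterpart is |t| up to a dimension constant: by rotational invariance, averaging |⟨ξ, x−y⟩| over random directions ξ on the unit sphere recovers a constant multiple of ‖x−y‖. Each one-dimensional summation can then be done in O(N log N) with the sorting approach mentioned in the abstract. All function names and the empirical constant estimate below are illustrative assumptions.

```python
import numpy as np

def fast_abs_sum_1d(p, q):
    """For each p_i, compute sum_j |p_i - q_j| in O((N+M) log M)
    via sorting and prefix sums instead of the naive O(N*M) loop."""
    q_sorted = np.sort(q)
    prefix = np.concatenate(([0.0], np.cumsum(q_sorted)))
    total = prefix[-1]
    # k[i] = number of q_j <= p_i
    k = np.searchsorted(q_sorted, p, side="right")
    left = p * k - prefix[k]                         # contributions with q_j <= p_i
    right = (total - prefix[k]) - p * (len(q) - k)   # contributions with q_j >  p_i
    return left + right

def sliced_distance_sum(x, y, n_dirs=512, seed=None):
    """Monte Carlo slicing: approximate s_i = sum_j ||x_i - y_j|| by
    averaging 1D |.|-kernel sums over random unit directions xi."""
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    acc = np.zeros(len(x))
    c = 0.0  # empirical estimate of c_d = E|<xi, e>| for fixed unit e
    for _ in range(n_dirs):
        xi = rng.standard_normal(d)
        xi /= np.linalg.norm(xi)
        acc += fast_abs_sum_1d(x @ xi, y @ xi)
        c += abs(xi[0])
    # (acc / n_dirs) estimates c_d * s, and (c / n_dirs) estimates c_d
    return acc / c
```

The one-dimensional routine is exact; the slicing step carries the usual Monte Carlo error, which decays with the number of projection directions. For smooth kernels such as the Gaussian, the paper replaces the sorting step with fast Fourier summation on non-equispaced data.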
