arXiv:2210.00244
On The Relative Error of Random Fourier Features for Preserving Kernel Distance

1 October 2022
Kuan Cheng, S. Jiang, Luojian Wei, Zhide Wei
Abstract

The method of random Fourier features (RFF), proposed in a seminal paper by Rahimi and Recht (NIPS'07), is a powerful technique for finding approximate low-dimensional representations of points in (high-dimensional) kernel space, for shift-invariant kernels. While RFF has been analyzed under various notions of error guarantee, its ability to preserve the kernel distance with \emph{relative} error is less understood. We show that for a significant range of kernels, including the well-known Laplacian kernels, RFF cannot approximate the kernel distance with small relative error using low dimensions. We complement this by showing that, as long as the shift-invariant kernel is analytic, RFF with $\mathrm{poly}(\epsilon^{-1} \log n)$ dimensions achieves $\epsilon$-relative error for the pairwise kernel distances of $n$ points, and the dimension bound improves to $\mathrm{poly}(\epsilon^{-1} \log k)$ for the specific application of kernel $k$-means. Finally, going beyond RFF, we take a first step towards data-oblivious dimension reduction for general shift-invariant kernels, and we obtain a similar $\mathrm{poly}(\epsilon^{-1} \log n)$ dimension bound for Laplacian kernels. We also validate the dimension-error tradeoff of our methods on simulated datasets; they demonstrate superior performance compared with other popular methods, including random-projection and Nyström methods.
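For readers unfamiliar with the RFF construction the abstract builds on, here is a minimal sketch of the original Rahimi-Recht map for the Gaussian (RBF) kernel $k(x,y) = \exp(-\|x-y\|^2 / (2\sigma^2))$: sample frequencies from the kernel's Fourier transform and take cosine features, so that inner products of the features approximate the kernel. The dimension `D`, bandwidth `sigma`, and seeds below are illustrative choices, not values from the paper, and this only demonstrates the additive-error behavior of RFF, not the paper's relative-error analysis.

```python
import numpy as np

def rff_features(X, D=2000, sigma=1.0, seed=0):
    """Map points X (n x d) to D-dimensional features whose inner
    products approximate the Gaussian kernel with bandwidth sigma."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Frequencies drawn from the kernel's Fourier transform, which for
    # the Gaussian kernel is itself a Gaussian with scale 1/sigma.
    W = rng.normal(scale=1.0 / sigma, size=(d, D))
    # Random phase shifts, uniform on [0, 2*pi).
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Compare the RFF inner product against the exact kernel on two points.
rng = np.random.default_rng(1)
x, y = rng.normal(size=3), rng.normal(size=3)
Z = rff_features(np.vstack([x, y]))
approx = Z[0] @ Z[1]
exact = np.exp(-np.sum((x - y) ** 2) / 2.0)
print(abs(approx - exact))  # small additive error for moderate D
```

Note that the approximation guarantee here is additive (error shrinking like $1/\sqrt{D}$); the paper's point is precisely that a \emph{relative}-error guarantee on kernel distances is a stronger requirement that some kernels, such as the Laplacian, cannot meet with low-dimensional RFF.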
