Generalization Guarantees for Sparse Kernel Approximation with Entropic Optimal Features

11 February 2020
Liang Ding
Rui Tuo
Shahin Shahrampour
Abstract

Despite their success, kernel methods suffer from a massive computational cost in practice. In this paper, in lieu of the commonly used kernel expansion with respect to $N$ inputs, we develop a novel optimal design maximizing the entropy among kernel features. This procedure results in a kernel expansion with respect to entropic optimal features (EOF), improving the data representation dramatically due to feature dissimilarity. Under mild technical assumptions, our generalization bound shows that with only $O(N^{1/4})$ features (disregarding logarithmic factors), we can achieve the optimal statistical accuracy (i.e., $O(1/\sqrt{N})$). The salient feature of our design is its sparsity, which significantly reduces the time and space costs. Our numerical experiments on benchmark datasets verify the superiority of EOF over the state-of-the-art in kernel approximation.
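
The abstract describes EOF only at a high level. As a rough, hedged illustration of the general idea, the Python sketch below greedily selects landmark features by maximizing the log-determinant (a Gaussian entropy surrogate) of their kernel submatrix and then builds Nyström-style features from them; the RBF kernel, the greedy log-det rule, and all function names are illustrative assumptions rather than the paper's actual EOF construction.

```python
import numpy as np


def rbf_kernel(X, Y, gamma=1.0):
    """Pairwise RBF (Gaussian) kernel matrix between rows of X and rows of Y."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)


def greedy_entropy_landmarks(X, m, gamma=1.0, jitter=1e-8):
    """Greedily pick m landmarks whose kernel submatrix has maximal
    log-determinant (a differential-entropy surrogate for a Gaussian).
    This is a generic entropy-style design, not the paper's exact EOF rule."""
    selected, remaining = [], list(range(X.shape[0]))
    for _ in range(m):
        best_i, best_val = None, -np.inf
        for i in remaining:
            idx = selected + [i]
            K = rbf_kernel(X[idx], X[idx], gamma) + jitter * np.eye(len(idx))
            _, logdet = np.linalg.slogdet(K)
            if logdet > best_val:
                best_i, best_val = i, logdet
        selected.append(best_i)
        remaining.remove(best_i)
    return selected


def nystrom_features(X, landmarks, gamma=1.0, jitter=1e-8):
    """Map X to an m-dimensional feature space so that Phi @ Phi.T
    approximates the full N x N kernel matrix (Nystrom-style construction)."""
    K_nm = rbf_kernel(X, landmarks, gamma)
    K_mm = rbf_kernel(landmarks, landmarks, gamma) + jitter * np.eye(len(landmarks))
    eigvals, eigvecs = np.linalg.eigh(K_mm)
    K_mm_inv_sqrt = eigvecs @ np.diag(eigvals ** -0.5) @ eigvecs.T
    return K_nm @ K_mm_inv_sqrt


# Toy usage: with N = 1000 inputs, roughly N^{1/4} (about 6) features are
# the budget at which the abstract's O(1/sqrt(N)) rate is claimed (up to logs).
X = np.random.randn(1000, 5)
m = int(np.ceil(X.shape[0] ** 0.25))
landmark_idx = greedy_entropy_landmarks(X, m, gamma=0.5)
Phi = nystrom_features(X, X[landmark_idx], gamma=0.5)
print(Phi.shape)  # (1000, 6)
```

The point of the sketch is the feature budget: only on the order of $N^{1/4}$ features are retained, which is what makes the resulting kernel expansion sparse and cheap to store and evaluate.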
