Near-Optimal Coresets of Kernel Density Estimates

6 February 2018
J. M. Phillips
W. Tai
arXiv:1802.01751 (abs, PDF, HTML)
Abstract

We construct near-optimal coresets for kernel density estimates of points in $\mathbb{R}^d$ when the kernel is positive definite. Specifically, we give a polynomial-time construction of a coreset of size $O(\sqrt{d \log(1/\epsilon)}/\epsilon)$, and we show a near-matching lower bound of $\Omega(\sqrt{d}/\epsilon)$. The upper bound is a polynomial-in-$1/\epsilon$ improvement when $d \in [3, 1/\epsilon^2)$ (for all kernels except the Gaussian kernel, which had a previous upper bound of $O((1/\epsilon) \log^d (1/\epsilon))$), and the lower bound is the first known lower bound for this problem that depends on $d$. Moreover, the restriction that the kernel be positive definite is significant in that the upper bound applies to a wide variety of kernels, specifically those most important for machine learning. This includes kernels for information distances and the sinc kernel, which can take negative values.
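The coreset guarantee here is an $L_\infty$ bound on the kernel density estimate: a small subset $Q \subset P$ such that $\sup_x |\mathrm{KDE}_P(x) - \mathrm{KDE}_Q(x)| \le \epsilon$, where $\mathrm{KDE}_P(x)$ averages the kernel values $K(p, x)$ over $p \in P$. The minimal Python sketch below illustrates that guarantee numerically with a Gaussian kernel. It is not the paper's construction: a uniform random subsample of the prescribed size is used as a stand-in, and all names and parameter values (P, Q, eps, bandwidth, the query set) are illustrative assumptions.

```python
import numpy as np

def kde(queries, data, bandwidth=1.0):
    """Gaussian kernel density estimate: mean kernel value over the point set."""
    # Pairwise squared distances between query points and data points.
    d2 = ((queries[:, None, :] - data[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2 * bandwidth ** 2)).mean(axis=1)

rng = np.random.default_rng(0)
d = 5
P = rng.normal(size=(5000, d))  # full point set P in R^d (synthetic example data)

eps = 0.1
# Coreset size suggested by the paper's upper bound, O(sqrt(d log(1/eps)) / eps).
m = int(np.ceil(np.sqrt(d * np.log(1 / eps)) / eps))
# Stand-in construction: a uniform random subsample, NOT the paper's algorithm.
Q = P[rng.choice(len(P), size=m, replace=False)]

# Empirically estimate the worst-case error sup_x |KDE_P(x) - KDE_Q(x)|
# over a finite set of query points.
queries = rng.normal(size=(200, d))
err = np.abs(kde(queries, P) - kde(queries, Q)).max()
print(f"coreset size {m}, observed max KDE error {err:.4f} (target eps = {eps})")
```

The uniform sample is only a placeholder to make the error measurement concrete; achieving the quoted $O(\sqrt{d \log(1/\epsilon)}/\epsilon)$ size with a worst-case guarantee is exactly what the paper's construction for positive definite kernels provides.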
