
arXiv:2510.02540
Even Faster Kernel Matrix Linear Algebra via Density Estimation

2 October 2025
Rikhav Shah
Sandeep Silwal
Haike Xu
Main: 37 pages · 4 figures · 3 tables · Bibliography: 3 pages
Abstract

This paper studies the use of kernel density estimation (KDE) for linear algebraic tasks involving the kernel matrix of a collection of n data points in R^d. In particular, we improve upon existing algorithms for computing the following up to (1+ε) relative error: matrix-vector products, matrix-matrix products, the spectral norm, and the sum of all entries. The runtimes of our algorithms depend on the dimension d, the number of points n, and the target error ε. Importantly, the dependence on n in each case is far lower when accessing the kernel matrix through KDE queries rather than by reading individual entries. Our improvements over the existing best algorithms for these tasks (in particular those of Backurs, Indyk, Musco, and Wagner '21) reduce the polynomial dependence on ε, and additionally decrease the dependence on n in the case of computing the sum of all entries of the kernel matrix. We complement our upper bounds with several lower bounds for related problems, which provide (conditional) quadratic-time hardness results and additionally hint at the limits of KDE-based approaches for the problems we study.
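To make the KDE-query access model concrete, here is a minimal sketch (a naive exact computation, not the paper's sublinear-time algorithm; the Gaussian kernel and bandwidth are illustrative assumptions) showing that each coordinate of a kernel matrix-vector product Kv is exactly a weighted KDE query at the corresponding data point:

```python
import numpy as np

def gaussian_kernel_matrix(X, bandwidth=1.0):
    # K[i, j] = exp(-||x_i - x_j||^2 / bandwidth^2), the full n x n kernel matrix.
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / bandwidth**2)

def kde_query(X, weights, q, bandwidth=1.0):
    # A weighted KDE query at q: sum_i weights[i] * k(x_i, q).
    # The paper's algorithms assume such queries can be answered
    # approximately in time sublinear in n; here we compute it exactly.
    sq = np.sum((X - q) ** 2, axis=-1)
    return np.dot(weights, np.exp(-sq / bandwidth**2))

def matvec_via_kde(X, v, bandwidth=1.0):
    # (K v)_j = sum_i k(x_j, x_i) v_i is a weighted KDE query at x_j,
    # so n queries recover the whole matrix-vector product.
    return np.array([kde_query(X, v, x, bandwidth) for x in X])

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
v = rng.standard_normal(50)
assert np.allclose(gaussian_kernel_matrix(X) @ v, matvec_via_kde(X, v))
```

With approximate KDE data structures, each query carries (1+ε)-type error, which is the error model under which the paper's matrix-vector, matrix-matrix, spectral-norm, and entry-sum guarantees are stated.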
