Improved Algorithms for Kernel Matrix-Vector Multiplication Under Sparsity Assumptions

Motivated by the problem of fast processing of attention matrices, we study fast algorithms for computing matrix-vector products for asymmetric Gaussian kernel matrices $K \in \mathbb{R}^{n \times n}$. $K$'s columns are indexed by a set of $n$ keys $k_1, \dots, k_n \in \mathbb{R}^d$, its rows by a set of $n$ queries $q_1, \dots, q_n \in \mathbb{R}^d$, and its $(i, j)$ entry is $e^{-\|q_i - k_j\|_2^2 / 2\sigma^2}$ for some bandwidth parameter $\sigma > 0$. Given a vector $x \in \mathbb{R}^n$ and an error parameter $\epsilon > 0$, our task is to output a vector $y \in \mathbb{R}^n$ such that $\|Kx - y\|_2 \le \epsilon \|x\|_2$, in time subquadratic in $n$ and linear in $d$. Our algorithms rely on the following modelling assumption about the matrices $K$: the sum of the entries of $K$ scales linearly in $n$, as opposed to worst-case quadratic growth. We validate this assumption experimentally, for Gaussian kernel matrices encountered in various settings such as fast attention computation in LLMs. We obtain the first subquadratic-time algorithm that works under this assumption, for unrestricted vectors.
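To make the problem setup concrete, the following NumPy sketch builds the asymmetric Gaussian kernel matrix, computes the exact product $Kx$ as the naive quadratic-time baseline, and evaluates the quantity behind the modelling assumption (the total entry mass of $K$ divided by $n$). The function names, the random test data, and the $e^{-\|q - k\|^2/2\sigma^2}$ normalization are illustrative choices, not code from the paper, and the sketch does not implement the paper's subquadratic algorithm itself.

```python
import numpy as np

def gaussian_kernel_matrix(queries, keys, sigma):
    """Dense asymmetric Gaussian kernel matrix with
    K[i, j] = exp(-||q_i - k_j||_2^2 / (2 * sigma^2))."""
    sq_dists = np.sum((queries[:, None, :] - keys[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma**2))

def meets_error_bound(K, x, y, eps):
    """Check the target guarantee ||Kx - y||_2 <= eps * ||x||_2."""
    return np.linalg.norm(K @ x - y) <= eps * np.linalg.norm(x)

rng = np.random.default_rng(0)
n, d, sigma = 512, 32, 1.0
queries = rng.normal(size=(n, d))
keys = rng.normal(size=(n, d))
x = rng.normal(size=n)

K = gaussian_kernel_matrix(queries, keys, sigma)
y = K @ x  # naive O(n^2 d) baseline that the paper's algorithms aim to beat

# The modelling assumption: the sum of K's entries grows like O(n) rather
# than n^2, i.e. K.sum() / n stays bounded as n grows.
entry_sum_per_n = K.sum() / n
print(entry_sum_per_n)
```

Any candidate fast algorithm producing an approximation `y` can be checked against the exact product with `meets_error_bound(K, x, y, eps)`.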