A General Theory for Kernel Packets: from state space model to compactly supported basis

It is well known that the state space (SS) model formulation of a Gaussian process (GP) can lower both its training and prediction time to O(n) for n data points. We prove that an $m$-dimensional SS model formulation of a GP is equivalent to a concept we introduce as the general right Kernel Packet (KP): a transformation of the GP covariance function $K$ such that $\sum_{i=0}^{m} a_i D_t^{(j)} K(t, t_i) = 0$ holds for any $t \le t_0$, $0 \le j \le m-1$, and $m+1$ consecutive points $t_i$, where $D_t^{(j)}$ denotes the $j$-th order derivative acting on $t$. We extend this idea to the backward SS model formulation of the GP, leading to the concept of the left KP for the next $m$ consecutive points: $\sum_{i=0}^{m} b_i D_t^{(j)} K(t, t_{m+i}) = 0$ for any $t \ge t_{2m}$. By combining both left and right KPs, we can prove that a suitable linear combination of these covariance functions yields $m$ compactly supported KP functions: $\phi^{(j)}(t) = 0$ for any $t \notin (t_0, t_{2m})$ and $0 \le j \le m-1$. KPs further reduce the prediction time of GPs to O(log n) or even O(1) and can be applied to more general problems involving derivatives of GPs.
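As a concrete illustration (our own example with an assumed kernel, not taken from the abstract), consider the exponential kernel $K(t,s) = e^{-|t-s|}$ of the Ornstein-Uhlenbeck process, whose SS model is one-dimensional, so $m = 1$ and only $j = 0$ is involved. For $t \le t_0 \le t_1$ we have $K(t, t_i) = e^{t} e^{-t_i}$, so the right KP condition $a_0 K(t, t_0) + a_1 K(t, t_1) = 0$ holds for all $t \le t_0$ with $a_0 = 1$, $a_1 = -e^{t_1 - t_0}$; symmetrically, $b_0 = 1$, $b_1 = -e^{t_1 - t_2}$ gives a left KP on $t_1, t_2$ valid for all $t \ge t_2$. Combining both constraints over the $2m + 1 = 3$ points $t_0 < t_1 < t_2$, any nonzero solution of
$$c_0 e^{-t_0} + c_1 e^{-t_1} + c_2 e^{-t_2} = 0, \qquad c_0 e^{t_0} + c_1 e^{t_1} + c_2 e^{t_2} = 0$$
yields a KP $\phi(t) = \sum_{i=0}^{2} c_i K(t, t_i)$ with $\phi(t) = 0$ for every $t \notin (t_0, t_2)$; for instance, $t_0 = -1$, $t_1 = 0$, $t_2 = 1$ gives $c_0 = c_2 = 1$ and $c_1 = -(e + e^{-1})$.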