A sparse linear algebra algorithm for fast computation of prediction variances with Gaussian Markov random fields

Gaussian Markov random fields are used in a wide range of disciplines, including machine vision and spatial statistics. These models exploit the sparsity that Markov assumptions introduce into the precision matrix, so that inference and prediction reduce to sparse linear algebra operations that scale well with the size of the state space. Yet this state space can still be very large, and computing predictive variances of linear combinations of variables, a common task in spatial prediction, is generally computationally prohibitive. Approximate methods (typically interpolation or conditional simulation) are therefore used to circumvent this problem. Here we establish the conditions under which the variances of linear combinations of random variables can be computed exactly using the Takahashi recursions. The resulting computational simplification has wide applicability and may be used to enhance several software packages in which model fitting is based on a maximum-likelihood framework. We apply the result to several applications in spatial statistics, including LatticeKrig models, statistical downscaling, and fixed rank kriging, and show that the algorithm can compute hundreds of thousands of exact predictive variances of linear combinations with ease on a standard desktop, even when large spatial GMRF models are used.
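To make the central idea concrete, the following is a minimal dense sketch of the Takahashi recursions (the function name `takahashi_dense` and the small example precision matrix are illustrative, not from the paper): given the lower-triangular Cholesky factor L of a precision matrix Q = L Lᵀ, the recursions recover entries of the covariance Σ = Q⁻¹ column by column, working backwards. Practical implementations exploit sparsity and compute only the entries of Σ on the symbolic pattern of L; this dense version fills the whole matrix for clarity.

```python
import numpy as np

def takahashi_dense(L):
    """Recover Sigma = inv(L @ L.T) from the lower-triangular Cholesky factor L.

    Dense version for clarity; a sparse implementation would visit only the
    entries of Sigma on the symbolic pattern of L.
    """
    n = L.shape[0]
    S = np.zeros((n, n))
    for j in range(n - 1, -1, -1):          # columns, last to first
        for i in range(j, -1, -1):          # rows, diagonal upwards
            # Takahashi equation:
            # S[i,j] = delta_ij / L[i,i]^2 - (1/L[i,i]) * sum_{k>i} L[k,i] * S[k,j]
            acc = 0.0
            for k in range(i + 1, n):
                # entries below the diagonal of column j come from symmetry
                skj = S[k, j] if k <= j else S[j, k]
                acc += L[k, i] * skj
            S[i, j] = (1.0 if i == j else 0.0) / L[i, i] ** 2 - acc / L[i, i]
            S[j, i] = S[i, j]               # symmetric fill
    return S

# Small example: a tridiagonal (Markov) precision matrix
Q = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])
L = np.linalg.cholesky(Q)
Sigma = takahashi_dense(L)

# Predictive variance of a linear combination a^T x: only the entries of Sigma
# on the support of a are needed, which is what makes the exact computation cheap.
a = np.array([1.0, 0.5, 0.0])
var = a @ Sigma @ a
```

On this toy example `Sigma` agrees with `np.linalg.inv(Q)`; the point of the paper is that for sparse `a`, exact variances like `var` remain cheap even when Q is far too large for any dense inverse.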