
Covariance Estimation in High Dimensions via Kronecker Product Expansions

Abstract

This paper presents a new method for estimating high dimensional covariance matrices. Our method, permuted rank-penalized least-squares (PRLS), is based on Kronecker product series expansions of the true covariance matrix. Assuming an i.i.d. Gaussian random sample, we establish high dimensional rates of convergence to the true covariance as both the number of samples and the number of variables go to infinity. For covariance matrices of low separation rank, our results establish that PRLS converges significantly faster than the standard sample covariance matrix (SCM) estimator. In addition, this framework captures a fundamental tradeoff between estimation error and approximation error, thus providing a covariance estimation framework that scales with the separation rank, an analog to low rank approximation of covariance matrices \cite{Lounici}. The MSE convergence rates generalize the high dimensional rates recently obtained for the ML Flip-flop algorithm \cite{KGlasso13, TsiligkaridisTSP}. We show that a class of block Toeplitz covariance matrices has low separation rank and give bounds on the minimal separation rank $r$ needed to achieve an $\epsilon$-approximation error. Simulations are presented to numerically evaluate the performance of the proposed estimator.
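To make the Kronecker product series idea concrete, the sketch below (an illustration under assumed names and dimensions, not the paper's PRLS estimator or its rank penalization) uses the standard Van Loan-Pitsianis rearrangement: a $pq \times pq$ covariance is reshaped into a $p^2 \times q^2$ matrix whose truncated SVD of rank $r$ corresponds to a separation-rank-$r$ approximation $\Sigma \approx \sum_{k=1}^{r} A_k \otimes B_k$.

```python
# Minimal sketch of a Kronecker product series approximation of a covariance
# matrix via the rearrangement (Van Loan-Pitsianis) trick. Names p, q, r,
# Sigma, and the helper functions are illustrative assumptions, not the
# paper's PRLS estimator.
import numpy as np

def rearrange(Sigma, p, q):
    """Stack vec of each q x q block (i, j) of a pq x pq matrix as a row of a p^2 x q^2 matrix."""
    R = np.empty((p * p, q * q))
    for i in range(p):
        for j in range(p):
            block = Sigma[i*q:(i+1)*q, j*q:(j+1)*q]
            R[i*p + j] = block.ravel()
    return R

def kron_expansion(Sigma, p, q, r):
    """Separation-rank-r approximation: truncated SVD of the rearranged matrix,
    mapped back as a sum of r Kronecker products A_k kron B_k."""
    R = rearrange(Sigma, p, q)
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    Sigma_r = np.zeros_like(Sigma)
    for k in range(r):
        A_k = np.sqrt(s[k]) * U[:, k].reshape(p, p)
        B_k = np.sqrt(s[k]) * Vt[k].reshape(q, q)
        Sigma_r += np.kron(A_k, B_k)
    return Sigma_r

# Example: a covariance of separation rank 2 is recovered exactly at r = 2.
rng = np.random.default_rng(0)
p, q = 4, 3

def random_cov(n):
    X = rng.standard_normal((n, n))
    return X @ X.T

Sigma = np.kron(random_cov(p), random_cov(q)) + np.kron(random_cov(p), random_cov(q))
print(np.allclose(Sigma, kron_expansion(Sigma, p, q, r=2)))  # True
```

Increasing $r$ trades approximation error for estimation error when $\Sigma$ is replaced by a noisy sample covariance, which is the tradeoff the abstract refers to; the paper's estimator additionally penalizes the rank of the rearranged matrix rather than hard-truncating it.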
