
High dimensional errors-in-variables models with dependent measurements

Abstract

We consider a parsimonious model for fitting observation data $X = X_0 + W$ with two-way dependencies; that is, we use the signal matrix $X_0$ to explain column-wise dependency in $X$, and the measurement error matrix $W$ to explain its row-wise dependency. In the matrix normal setting, $X$ follows the matrix variate normal distribution with a Kronecker sum covariance structure: ${\rm vec}\{X\} \sim \mathcal{N}(0, \Sigma)$ where $\Sigma = A \oplus B$. This is generalized to the subgaussian setting as follows. Suppose that we observe $y \in {\bf R}^f$ and $X \in {\bf R}^{f \times m}$ in the following model: \begin{eqnarray*} y & = & X_0 \beta^* + \epsilon \\ X & = & X_0 + W \end{eqnarray*} where $X_0$ is an $f \times m$ design matrix with independent subgaussian row vectors, $\epsilon \in {\bf R}^f$ is a noise vector, and $W$ is a mean-zero $f \times m$ random noise matrix with independent subgaussian column vectors, independent of $X_0$ and $\epsilon$. This model differs significantly from those analyzed in the literature. Under sparsity and restricted eigenvalue type conditions, we show that one can recover a sparse vector $\beta^* \in {\bf R}^m$ from this model given a single observation matrix $X$ and the response vector $y$. We establish consistency in estimating $\beta^*$ and obtain rates of convergence in the $\ell_q$ norm, for $q = 1, 2$ for the Lasso-type estimator, and for $q \in [1, 2]$ for a Dantzig-type conic programming estimator.
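To make the data-generating model concrete, the following is a minimal NumPy simulation of the additive errors-in-variables setup $y = X_0 \beta^* + \epsilon$, $X = X_0 + W$, where only $X$ and $y$ are observed. All dimensions, noise scales, and the sparsity level are illustrative assumptions, not values from the paper; likewise, the plug-in Gram-matrix correction shown at the end is a standard textbook-style device for unregularized errors-in-variables regression, not the paper's Lasso-type or Dantzig-type conic programming estimator.

```python
import numpy as np

rng = np.random.default_rng(0)
f, m, s = 200, 50, 5           # sample size, dimension, sparsity (assumed)

beta_star = np.zeros(m)        # sparse target vector beta*
beta_star[:s] = 1.0

X0 = rng.standard_normal((f, m))        # signal matrix, independent rows
W = 0.3 * rng.standard_normal((f, m))   # measurement error, independent of X0
eps = 0.1 * rng.standard_normal(f)      # additive noise in the response

y = X0 @ beta_star + eps
X = X0 + W                     # only X and y are observed

# Naive least squares on the contaminated X is biased toward zero
# (the classical attenuation effect of errors in variables).
beta_naive = np.linalg.lstsq(X, y, rcond=None)[0]

# A plug-in correction (tau2 = Var(W_ij) assumed known here) replaces the
# contaminated Gram matrix X^T X / f by an unbiased estimate of the clean
# one, X_0^T X_0 / f, before solving the normal equations.
tau2 = 0.09
gram_corr = X.T @ X / f - tau2 * np.eye(m)
beta_corr = np.linalg.solve(gram_corr, X.T @ y / f)
```

In the high-dimensional regime ($m > f$) that the paper addresses, the corrected Gram matrix is no longer positive semidefinite and this unregularized solve breaks down, which is what motivates the sparsity and restricted eigenvalue conditions behind the Lasso-type and conic programming estimators.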
