We consider the problem of recovering a structured signal x from noisy linear observations y = BAx + w. The measurement matrix is modeled as BA, where B is arbitrary and A has independent sub-gaussian rows. By varying B and the sub-gaussian distribution of A, this gives a family of measurement matrices which may have heavy tails, dependent rows and columns, and singular values with a large dynamic range. When the structure is given as a possibly non-convex cone T, an approximate empirical risk minimizer is proven to be a robust estimator if the effective number of measurements is sufficient, even in the presence of a model mismatch. In classical compressed sensing with independent (sub-)gaussian measurements, one asks how many measurements are needed to recover x. In our setting, however, the effective number of measurements depends on the properties of B. We show that the effective rank of B may be used as a surrogate for the number of measurements, and if this exceeds the squared Gaussian mean width of T, then accurate recovery is guaranteed. Furthermore, we examine the special case of generative priors in detail, that is, when x lies close to the range of G and G is a Generative Neural Network (GNN) with ReLU activation functions. Our work relies on a recent result in random matrix theory by Jeong, Li, Plan, and Yilmaz (arXiv:2001.10631).
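The abstract does not spell out the two key quantities it compares. A minimal numerical sketch, assuming the standard "stable rank" definition of effective rank, r(B) = ||B||_F^2 / ||B||_op^2, and the usual Gaussian mean width w(T) = E sup_{t in T} <g, t> with g standard Gaussian (both are common conventions; the paper's exact definitions may differ):

```python
import numpy as np

rng = np.random.default_rng(0)

def effective_rank(B):
    # Stable-rank surrogate: ||B||_F^2 / ||B||_op^2 (assumed definition).
    s = np.linalg.svd(B, compute_uv=False)
    return (s ** 2).sum() / s[0] ** 2

def gaussian_mean_width(points, n_trials=2000, rng=rng):
    # Monte Carlo estimate of w(T) = E sup_{t in T} <g, t>,
    # where T is approximated by a finite point cloud (rows of `points`).
    n = points.shape[1]
    g = rng.standard_normal((n_trials, n))
    return (g @ points.T).max(axis=1).mean()

# A matrix whose singular values have a large dynamic range: even though
# it is 4x4 and full rank, its effective rank is close to 1, so it
# supplies roughly one "effective measurement".
B = np.diag([10.0, 1.0, 0.1, 0.01])
print(effective_rank(B))  # close to 1
print(effective_rank(np.eye(4)))  # 4: all measurements effective
```

This illustrates why effective rank, rather than the raw number of rows, is the natural measurement count here: scaling some rows of B down toward zero shrinks the effective rank without changing its dimensions.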