An unknown $m$-by-$n$ matrix $X_0$ is to be estimated from noisy measurements $Y = X_0 + Z$, where the noise matrix $Z$ has i.i.d. Gaussian entries. A popular matrix denoising scheme solves the nuclear norm penalization problem $\min_X \| Y - X \|_F^2/2 + \lambda \|X\|_*$, where $\|X\|_*$ denotes the nuclear norm (sum of singular values). This is the analog, for matrices, of $\ell_1$ penalization in the vector case. It has been empirically observed that, if $X_0$ has low rank, it may be recovered quite accurately from the noisy measurement $Y$. In a proportional growth framework where the rank $r_n$, number of rows $m_n$, and number of columns $n_n$ all tend to $\infty$ proportionally to each other ($r_n/m_n \to \rho$, $m_n/n_n \to \beta$), we evaluate the asymptotic minimax MSE $\mathcal{M}(\rho, \beta) = \lim_{n \to \infty} \inf_\lambda \sup_{\mathrm{rank}(X) \leq r_n} \mathrm{MSE}(X_0, \hat{X}_\lambda)$. Our formulas involve incomplete moments of the quarter- and semi-circle laws ($\beta = 1$, square case) and the Mar\v{c}enko-Pastur law ($\beta < 1$, non-square case). We also show that any least-favorable matrix $X_0$ has norm "at infinity". The nuclear norm penalization problem is solved by applying soft thresholding to the singular values of $Y$. We also derive the minimax threshold, namely the value $\lambda^*(\rho, \beta)$, which is the optimal place to threshold the singular values. All these results are obtained for general (non-square, non-symmetric) real matrices. Comparable results are obtained for square symmetric nonnegative-definite matrices.
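The closed-form solution mentioned in the abstract (soft thresholding of singular values) is easy to sketch. The snippet below is a minimal illustration, not the authors' code; the matrix sizes, rank, and threshold value are arbitrary choices for demonstration, and the threshold is not the minimax-optimal $\lambda^*$ derived in the paper.

```python
import numpy as np

def svt_denoise(Y, lam):
    """Solve min_X ||Y - X||_F^2 / 2 + lam * ||X||_* in closed form
    by soft-thresholding the singular values of Y."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_thresh = np.maximum(s - lam, 0.0)  # soft thresholding: shrink toward zero
    return (U * s_thresh) @ Vt

# Illustration with hypothetical sizes: rank-2 signal plus unit Gaussian noise
rng = np.random.default_rng(0)
m, n, r = 50, 80, 2
X0 = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
Y = X0 + rng.standard_normal((m, n))
Xhat = svt_denoise(Y, lam=2.0 * np.sqrt(n))  # threshold choice is illustrative only
```

Because the Frobenius norm is unitarily invariant, the matrix problem decouples into independent scalar problems on the singular values, which is why the solution is exactly scalar soft thresholding applied spectrally.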