We show that given an estimate $\hat{A}$ that is close to a general high-rank positive semi-definite (PSD) matrix $A$ in spectral norm (i.e., $\|\hat{A}-A\|_2 \le \delta$), the simple truncated singular value decomposition (SVD) of $\hat{A}$ produces a multiplicative approximation of $A$ in Frobenius norm. This observation leads to many interesting results on general high-rank matrix estimation problems, which we briefly summarize below ($A$ is an $n \times n$ high-rank PSD matrix and $A_k$ is the best rank-$k$ approximation of $A$):
(1) High-rank matrix completion: By observing $\Omega\left(\sigma_{k+1}(A)^{-2}\, n \max\{\epsilon^{-4}, k^2\}\, \mu_0^2 \|A\|_F^2 \log n\right)$ elements of $A$, where $\sigma_{k+1}(A)$ is the $(k+1)$-th singular value of $A$ and $\mu_0$ is the incoherence parameter, the truncated SVD of a zero-filled matrix satisfies $\|\hat{A}_k - A\|_F \le (1+O(\epsilon))\|A - A_k\|_F$ with high probability.
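As an illustration of result (1), here is a minimal NumPy sketch under assumed toy choices (the dimension $n$, target rank $k$, sampling rate $p$, the particular matrix $A$, and the helper `truncated_svd` are all hypothetical, not the paper's experiments):

```python
import numpy as np

def truncated_svd(M, k):
    """Best rank-k approximation of M in Frobenius norm, via SVD."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k]

rng = np.random.default_rng(0)
n, k, p = 200, 5, 0.5  # toy size, target rank, fraction of observed entries

# A high-rank PSD matrix: a dominant rank-k part plus a small full-rank tail.
G = rng.standard_normal((n, k))
T = rng.standard_normal((n, n))
A = G @ G.T + 0.01 * (T @ T.T) / n

# Observe each entry independently with probability p, zero-fill the rest,
# and rescale by 1/p so that E[A_hat] = A.
mask = rng.random((n, n)) < p
A_hat = np.where(mask, A, 0.0) / p

A_hat_k = truncated_svd(A_hat, k)  # estimate of A from partial observations
A_k = truncated_svd(A, k)          # best rank-k approximation of A itself

err_est = np.linalg.norm(A_hat_k - A, "fro")
err_best = np.linalg.norm(A - A_k, "fro")
```

By the Eckart–Young theorem, `err_best` lower-bounds the error of any rank-$k$ estimate, including `err_est`; the theorem above says the gap is a $(1+O(\epsilon))$ factor once enough entries are observed.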
(2) High-rank matrix de-noising: Let $\hat{A} = A + E$, where $E$ is a Gaussian random noise matrix with zero mean and variance $\nu^2/n$ on each entry. Then the truncated SVD of $\hat{A}$ satisfies $\|\hat{A}_k - A\|_F \le (1 + O(\nu/\sigma_{k+1}(A)))\|A - A_k\|_F + O(k\nu)$.
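Result (2) can be sanity-checked numerically; the sketch below uses assumed toy parameters (dimension $n$, rank $k$, noise scale $\nu$) and an exactly rank-$k$ signal, chosen for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, nu = 300, 4, 0.5  # toy dimension, target rank, noise scale

# A rank-k PSD signal with strong leading eigenvalues.
G = rng.standard_normal((n, k))
A = G @ G.T

# Gaussian noise with zero mean and variance nu^2 / n per entry.
E = rng.normal(scale=nu / np.sqrt(n), size=(n, n))
A_hat = A + E

# Truncated SVD of the noisy observation.
U, s, Vt = np.linalg.svd(A_hat, full_matrices=False)
A_hat_k = (U[:, :k] * s[:k]) @ Vt[:k]

err_denoised = np.linalg.norm(A_hat_k - A, "fro")
err_raw = np.linalg.norm(A_hat - A, "fro")  # = ||E||_F, no denoising
```

Since the noise is spread across all $n$ directions while the signal lives in $k$ of them, projecting onto the top-$k$ singular subspace discards most of $\|E\|_F$, so `err_denoised` should come out well below `err_raw`.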
(3) Low-rank estimation of high-dimensional covariance: Given $N$ i.i.d.\ samples $X_1, \cdots, X_N \sim \mathcal{N}_n(0, A)$, can we estimate $A$ with a relative-error Frobenius norm bound? We show that if $N = \Omega(n \max\{\epsilon^{-4}, k^2\}\, \gamma_k(A)^2 \log N)$ for $\gamma_k(A) = \sigma_1(A)/\sigma_{k+1}(A)$, then $\|\hat{A}_k - A\|_F \le (1+O(\epsilon))\|A - A_k\|_F$ with high probability, where $\hat{A} = \frac{1}{N}\sum_{i=1}^N X_i X_i^\top$ is the sample covariance.
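A hedged NumPy sketch of result (3) follows; the spiked covariance $A$, sample size $N$, and rank $k$ are illustrative assumptions rather than the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(2)
n, N, k = 50, 5000, 3  # dimension, sample size, target rank

# A spiked covariance: k large eigenvalues (10) atop an identity tail,
# so gamma_k(A) = sigma_1(A) / sigma_{k+1}(A) = 10.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
eigvals = np.concatenate([np.full(k, 10.0), np.ones(n - k)])
A = (Q * eigvals) @ Q.T

# Draw N i.i.d. samples X_i ~ N_n(0, A); rows of X are the samples.
X = rng.standard_normal((N, n)) @ np.linalg.cholesky(A).T
A_hat = X.T @ X / N  # sample covariance (1/N) sum_i X_i X_i^T

# Truncated eigendecomposition (= truncated SVD, since A_hat is PSD).
w, V = np.linalg.eigh(A_hat)  # eigenvalues in ascending order
A_hat_k = (V[:, -k:] * w[-k:]) @ V[:, -k:].T

err_est = np.linalg.norm(A_hat_k - A, "fro")
# Best rank-k approximation of A keeps only the k spiked directions.
A_k = (Q[:, :k] * eigvals[:k]) @ Q[:, :k].T
err_best = np.linalg.norm(A - A_k, "fro")
```

With $N \gg n \gamma_k(A)^2$ as in the theorem, `err_est` should sit within a small multiplicative factor of the unavoidable `err_best` $= \|A - A_k\|_F$.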