Entrywise Eigenvector Analysis of Random Matrices with Low Expected Rank

Abstract

Recovering low-rank structures via eigenvector perturbation analysis is a common problem in statistical machine learning, such as in factor analysis, community detection, ranking, matrix completion, among others. While a large variety of bounds are available for average errors between empirical and population statistics of eigenvectors, few results are tight for entrywise analyses, which are critical for a number of problems such as community detection. This paper investigates entrywise behaviors of eigenvectors for a large class of random matrices whose expectations are low-rank, which helps settle the conjecture in Abbe et al. (2014b) that the spectral algorithm achieves exact recovery in the stochastic block model without any trimming or cleaning steps. The key is a first-order approximation of eigenvectors under the $\ell_\infty$ norm: $u_k \approx \frac{A u_k^*}{\lambda_k^*}$, where $\{u_k\}$ and $\{u_k^*\}$ are eigenvectors of a random matrix $A$ and its expectation $\mathbb{E}A$, respectively, and $\lambda_k^*$ denotes the corresponding eigenvalue of $\mathbb{E}A$. The fact that the approximation is both tight and linear in $A$ facilitates sharp comparisons between $u_k$ and $u_k^*$. In particular, it allows for comparing the signs of $u_k$ and $u_k^*$ even if $\|u_k - u_k^*\|_\infty$ is large. The results are further extended to perturbations of eigenspaces, yielding new $\ell_\infty$-type bounds for synchronization ($\mathbb{Z}_2$-spiked Wigner model) and noisy matrix completion.
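
As a quick numerical illustration of the first-order approximation above, the following minimal sketch (not from the paper; the rank-one spiked Wigner model and all parameters are illustrative choices) compares the direct entrywise error $\|u_1 - u_1^*\|_\infty$ with the error of the linear expansion $\|u_1 - A u_1^*/\lambda_1^*\|_\infty$ for the leading eigenvector:

```python
# Hypothetical sketch: entrywise accuracy of u_1 ~= A u_1* / lambda_1*
# in a rank-one spiked Wigner model A = lambda* u* u*^T + W.
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Population structure: rank-one expectation with a delocalized eigenvector.
u_star = np.ones(n) / np.sqrt(n)
lam_star = 20.0 * np.sqrt(n)           # illustrative signal strength
EA = lam_star * np.outer(u_star, u_star)

# Symmetric Gaussian (Wigner-type) noise.
G = rng.standard_normal((n, n))
W = (G + G.T) / np.sqrt(2)
A = EA + W

# Leading empirical eigenvector, sign-aligned with the population one.
vals, vecs = np.linalg.eigh(A)          # eigenvalues in ascending order
u = vecs[:, -1]
u *= np.sign(u @ u_star)

approx = A @ u_star / lam_star          # linear first-order approximation

err_naive = np.max(np.abs(u - u_star))  # direct entrywise perturbation
err_first = np.max(np.abs(u - approx))  # residual of the first-order expansion
print(f"||u - u*||_inf          = {err_naive:.2e}")
print(f"||u - A u* / lam*||_inf = {err_first:.2e}")
```

In this regime the residual of the first-order expansion is typically an order of magnitude smaller than the raw entrywise perturbation, which is the sense in which the approximation is "tight" while remaining linear in $A$.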
