Sparse random tensors: concentration, regularization and applications

Abstract

We prove a non-asymptotic concentration inequality for sparse inhomogeneous random tensors under the spectral norm. For an order-$k$ inhomogeneous random tensor $T$ with sparsity $p_{\max}\geq \frac{c\log n}{n^{k-1}}$, we show that $\|T-\mathbb{E} T\|=O(\sqrt{np_{\max}})$ with high probability. We also provide a simple way to regularize $T$ such that $O(\sqrt{np_{\max}})$ concentration still holds down to sparsity $p_{\max}>\frac{c}{n^{k-1}}$. Our proofs are based on the techniques of Friedman, Kahn and Szemerédi (1989) and Feige and Ofek (2005), together with the discrepancy theory of random hypergraphs. We also show that our concentration inequality is rate optimal in the minimax sense. We illustrate our concentration and regularization results with three applications: (i) a randomized construction of hypergraphs of bounded degrees with good expander mixing properties, (ii) concentration of the adjacency matrices of sparse random hypergraphs, and (iii) concentration of sparsified tensors under uniform sampling.
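As a quick sanity check of the $O(\sqrt{np_{\max}})$ rate, one can simulate the order-2 (matrix) case, where $T$ has independent Bernoulli$(p)$ entries and $\mathbb{E} T = p J$. This sketch is illustrative only and not from the paper; the factor of 5 placing $p$ just above the $\frac{c\log n}{n}$ threshold is an arbitrary choice:

```python
# Illustrative simulation (not from the paper) of spectral-norm
# concentration for a sparse Bernoulli random matrix: the theorem
# predicts ||T - E T|| = O(sqrt(n * p_max)) once p >= c log(n) / n.
import numpy as np

rng = np.random.default_rng(0)
n = 500
p = 5 * np.log(n) / n  # sparsity just above the log(n)/n threshold

# T has i.i.d. Bernoulli(p) entries, so E T = p * J (all-ones matrix).
T = (rng.random((n, n)) < p).astype(float)
deviation = np.linalg.norm(T - p * np.ones((n, n)), ord=2)  # spectral norm
scale = np.sqrt(n * p)
ratio = deviation / scale

print(f"||T - E T|| = {deviation:.2f}, sqrt(n p) = {scale:.2f}, "
      f"ratio = {ratio:.2f}")
```

For a single draw the ratio $\|T-\mathbb{E} T\|/\sqrt{np}$ should be a modest constant, consistent with the claimed rate; higher-order tensors would require an implementation of the tensor spectral norm, which is NP-hard to compute in general.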
