arXiv:1911.09063

Sparse random tensors: concentration, regularization and applications

20 November 2019
Zhixin Zhou, Yizhe Zhu
Abstract

We prove a non-asymptotic concentration inequality for the spectral norm of sparse inhomogeneous random tensors with Bernoulli entries. For an order-$k$ inhomogeneous random tensor $T$ with sparsity $p_{\max}\geq \frac{c\log n}{n}$, we show that $\|T-\mathbb{E} T\|=O(\sqrt{n p_{\max}}\log^{k-2}(n))$ with high probability. The optimality of this bound up to polylog factors is provided by an information-theoretic lower bound. By tensor unfolding, we extend the range of sparsity to $p_{\max}\geq \frac{c\log n}{n^{m}}$ with $1\leq m\leq k-1$ and obtain concentration inequalities for different sparsity regimes. We also provide a simple way to regularize $T$ such that $O(\sqrt{n^{m}p_{\max}})$ concentration still holds down to sparsity $p_{\max}\geq \frac{c}{n^{m}}$ with $k/2\leq m\leq k-1$. We present our concentration and regularization results with two applications: (i) a randomized construction of hypergraphs of bounded degrees with good expander mixing properties, (ii) concentration of sparsified tensors under uniform sampling.
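The quantities in the abstract are easy to probe numerically. The sketch below is not the authors' code: the dimension $n=60$, the constant $c=5$ in $p_{\max} = c\log n / n$, and all variable names are illustrative choices. It samples a sparse inhomogeneous order-$3$ Bernoulli tensor with entry probabilities at most $p_{\max}$, then computes the operator norm of one matrix unfolding of $T-\mathbb{E}T$. Since $T(x,y,z) = x^{\top} M (y\otimes z)$ for the $n\times n^{2}$ unfolding $M$, this matrix norm upper-bounds the tensor spectral norm $\|T-\mathbb{E}T\|$ and concentrates at the $\sqrt{n^{k-1}p_{\max}}$ scale, i.e. the $m=k-1$ case of the paper's unfolding bound.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 60                               # illustrative dimension (not from the paper)
k = 3                                # tensor order
p_max = 5 * np.log(n) / n            # sparsity in the regime p_max >= c*log(n)/n

# Inhomogeneous entry probabilities, each bounded by p_max
P = p_max * rng.uniform(size=(n,) * k)
# Independent Bernoulli entries: T_ijl ~ Bernoulli(P_ijl)
T = (rng.uniform(size=(n,) * k) < P).astype(float)

# Unfold the centered tensor T - E[T] into an n x n^(k-1) matrix; its
# operator norm (largest singular value) upper-bounds ||T - E T||.
M = (T - P).reshape(n, n ** (k - 1))
unfolded_norm = np.linalg.norm(M, ord=2)

print(f"||unfold(T - ET)||    = {unfolded_norm:.2f}")
print(f"sqrt(n * p_max)       = {np.sqrt(n * p_max):.2f}")             # tensor-norm scale (up to polylogs)
print(f"sqrt(n^(k-1) * p_max) = {np.sqrt(n ** (k - 1) * p_max):.2f}")  # unfolding scale, m = k-1
```

The unfolding is used here as a computable proxy: the spectral norm of the tensor itself is NP-hard to compute in general, whereas the matrix operator norm of the unfolding is a single SVD.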
