
Tensor Estimation with Nearly Linear Samples Given Weak Side Information

1 July 2020
Chao Yu
arXiv:2007.00736
Abstract

Tensor completion exhibits an interesting computational-statistical gap in terms of the number of samples needed to perform tensor estimation. While there are only $\Theta(tn)$ degrees of freedom in a $t$-order tensor with $n^t$ entries, the best known polynomial-time algorithm requires $O(n^{t/2})$ samples in order to guarantee consistent estimation. In this paper, we show that weak side information is sufficient to reduce the sample complexity to $O(n)$. The side information consists of a weight vector for each of the modes which is not orthogonal to any of the latent factors along that mode; this is significantly weaker than assuming noisy knowledge of the subspaces. We provide an algorithm that utilizes this side information to produce a consistent estimator with $O(n^{1+\kappa})$ samples for any small constant $\kappa > 0$.
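The setting described in the abstract can be made concrete with a small numerical sketch. The snippet below (hypothetical names and parameter values, not the paper's algorithm) builds a low-rank $t$-order tensor, samples roughly $n^{1+\kappa}$ entries uniformly at random, and checks the weak side-information condition: one weight vector per mode that is non-orthogonal to every latent factor along that mode.

    # Illustrative sketch of the problem setup (assumed parameters, not the paper's method)
    import numpy as np

    rng = np.random.default_rng(0)
    n, t, r = 50, 3, 2          # dimension per mode, tensor order, CP rank (hypothetical values)
    kappa = 0.1                 # small constant kappa > 0 from the sample bound

    # Ground-truth rank-r tensor: sum over j of the outer product of the j-th factor columns
    factors = [rng.standard_normal((n, r)) for _ in range(t)]
    T = np.einsum('ia,ja,ka->ijk', *factors)

    # Observe about n^(1+kappa) entries sampled uniformly at random
    m = int(np.ceil(n ** (1 + kappa)))
    obs_idx = rng.integers(0, n, size=(m, t))
    samples = T[tuple(obs_idx.T)]

    # Weak side information: one weight vector per mode, required to be
    # non-orthogonal to every latent factor along that mode
    weights = [rng.standard_normal(n) for _ in range(t)]
    condition_holds = all(np.all(np.abs(w @ F) > 1e-8) for w, F in zip(weights, factors))

    print(f"n^t = {n**t} entries, observed m = {m}; side-information condition holds: {condition_holds}")

A generic (e.g. Gaussian) weight vector satisfies the non-orthogonality requirement almost surely, which is why this form of side information is much weaker than requiring noisy knowledge of the latent subspaces themselves.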
