Positive Low-Rank Tensor Completion
Motivated by combinatorial regression problems (which we interpret as low-rank tensor completion), we study noisy completion for positive tensors. Existing approaches convert the problem into matrix completion, but this reduction cannot achieve the best possible statistical rates. Here, we show that a specific class of low-rank tensors (namely those parametrized as continuous extensions of hierarchical log-linear models) is amenable to efficient computation (with an appropriate choice of risk function) and leads to consistent estimation procedures in which hard-thresholding is used to estimate the low-rank structure in the tensor. Moreover, recent research has shown that approaches using different convex regularizers to exploit multiple sparse structures cannot simultaneously exploit all structures; we show that combining hard- and soft-thresholding provides one computationally tractable solution to this problem in the case of low-rank and sparse tensor completion. Numerical examples with synthetic data and data from a bioengineered metabolic network show that our estimation procedures are competitive with existing approaches to tensor completion.
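The abstract does not spell out the estimator, so the following is only a minimal illustrative sketch of the two thresholding operators it alludes to: hard-thresholding of singular values of a tensor unfolding (a generic surrogate for estimating low-rank structure) and elementwise soft-thresholding (for sparsity). The function names, the mode-1 unfolding, and the threshold values `tau` and `lam` are assumptions for illustration, not the paper's actual procedure.

```python
import numpy as np

def svd_hard_threshold(M, tau):
    """Hard-threshold singular values: keep only those above tau.
    Illustrative surrogate for low-rank estimation; the paper's estimator
    is tied to hierarchical log-linear parametrizations, not shown here."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    keep = s > tau
    return (U[:, keep] * s[keep]) @ Vt[keep, :]

def soft_threshold(X, lam):
    """Elementwise soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - lam, 0.0)

# Toy combined step on a mode-1 unfolding of a positive 3-way tensor
# (threshold values are arbitrary, chosen only to make the example run):
rng = np.random.default_rng(0)
T = np.abs(rng.standard_normal((4, 5, 6)))                 # positive tensor
M = T.reshape(4, -1)                                       # mode-1 unfolding
low_rank_part = svd_hard_threshold(M, tau=2.0)             # hard-threshold for low rank
sparse_part = soft_threshold(M - low_rank_part, lam=0.5)   # soft-threshold residual for sparsity
```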