arXiv:2303.04694
Two-sided Matrix Regression

8 March 2023
Nayel Bettache
C. Butucea
Abstract

The two-sided matrix regression model $Y = A^* X B^* + E$ aims at predicting $Y$ by taking into account both linear links among the column features of $X$, via the unknown matrix $B^*$, and among the row features of $X$, via the matrix $A^*$. We propose low-rank predictors in this high-dimensional matrix regression model via rank-penalized and nuclear-norm-penalized least squares. Neither criterion is jointly convex; however, we propose explicit predictors based on the SVD and show optimal prediction bounds. We give sufficient conditions for consistent rank selection, and we also propose a fully data-driven rank-adaptive procedure. Simulation results confirm the good prediction and the rank-consistency results under data-driven explicit choices of the tuning parameters and the scaling parameter of the noise.
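To illustrate the setting, a minimal NumPy sketch of the model $Y = A^* X B^* + E$ follows. All dimensions and the rank-truncation step are hypothetical choices for illustration; the paper's actual estimators are the rank-penalized and nuclear-norm-penalized least-squares criteria, not the plain truncated SVD of $Y$ used here as a crude low-rank predictor.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q, m = 30, 20, 15, 25   # Y is n x m, X is p x q (hypothetical sizes)
r = 2                          # true low rank (hypothetical)

# Low-rank A* (n x p) and B* (q x m), each a product of thin Gaussian factors
A_star = rng.normal(size=(n, r)) @ rng.normal(size=(r, p))
B_star = rng.normal(size=(q, r)) @ rng.normal(size=(r, m))
X = rng.normal(size=(p, q))
E = 0.1 * rng.normal(size=(n, m))      # small Gaussian noise
Y = A_star @ X @ B_star + E            # two-sided matrix regression model

# Crude low-rank predictor: keep the top-r singular directions of Y.
# (A stand-in for the paper's penalized criteria, which also use the SVD.)
U, s, Vt = np.linalg.svd(Y, full_matrices=False)
Y_hat = U[:, :r] * s[:r] @ Vt[:r, :]

signal = A_star @ X @ B_star
err_hat = np.linalg.norm(Y_hat - signal)   # prediction error of the rank-r fit
```

Since the signal $A^* X B^*$ has rank at most $r$, the Eckart–Young theorem bounds the prediction error of the rank-$r$ truncation by twice the noise norm, which the simulation confirms.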
