Low-rank regularization in two-sided matrix regression

Research output: Contribution to journal › Article › peer-review

Abstract

The two-sided matrix regression model Y = AXB + E aims at predicting Y by taking into account both linear links among the column features of X, via the unknown matrix B, and linear links among the row features of X, via the unknown matrix A. We propose low-rank predictors in this high-dimensional matrix regression model via rank-penalized and nuclear-norm-penalized least squares. Neither criterion is jointly convex; nevertheless, we derive explicit predictors based on the SVD and establish optimal prediction bounds. We give sufficient conditions for consistent rank selection, and we also propose a fully data-driven rank-adaptive procedure. Simulation results confirm the good prediction performance and the rank consistency of the procedure under data-driven explicit choices of the tuning parameters and of the noise scaling parameter.
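The key building block behind such low-rank predictors is truncation of the singular value decomposition: by the Eckart–Young theorem, keeping the top r singular components gives the best rank-r approximation in Frobenius norm. The sketch below illustrates only this denoising-by-truncation idea on a simplified model Y = M + E with low-rank M; the paper's actual estimators additionally account for the design matrix X and choose the rank in a data-driven way, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a noisy observation Y = M + E with a rank-2 signal M.
# (A simplified stand-in for the two-sided model Y = AXB + E.)
n, m, r = 50, 40, 2
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))
Y = M + 0.1 * rng.standard_normal((n, m))

def truncated_svd_predictor(Y, rank):
    """Best rank-`rank` approximation of Y (Eckart-Young)."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank, :]

M_hat = truncated_svd_predictor(Y, r)

# Truncating to the true rank removes most of the noise:
err_raw = np.linalg.norm(Y - M, "fro")       # error of the raw observation
err_lowrank = np.linalg.norm(M_hat - M, "fro")  # error of the rank-r predictor
print(err_lowrank < err_raw)
```

Nuclear-norm penalization acts similarly but shrinks all singular values by a soft threshold instead of keeping a hard cutoff at rank r.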

Original language: English
Pages (from-to): 1174-1198
Number of pages: 25
Journal: Electronic Journal of Statistics
Volume: 19
Issue number: 1
DOIs
Publication status: Published - 1 Jan 2025

Keywords

  • Matrix regression
  • multivariate response regression
  • nuclear norm penalized
  • oracle inequality
  • rank penalized
  • rank selection
  • two-sided matrix regression
