Abstract
The two-sided matrix regression model Y = A∗ XB∗ + E aims at predicting Y by taking into account linear links both among the column features of X, via the unknown matrix B∗, and among the row features of X, via the matrix A∗. We propose low-rank predictors in this high-dimensional matrix regression model via rank-penalized and nuclear-norm-penalized least squares. Neither criterion is jointly convex; nevertheless, we propose explicit predictors based on the SVD and establish optimal prediction bounds. We give sufficient conditions for consistent rank selection. We also propose a fully data-driven rank-adaptive procedure. Simulation results confirm the good prediction performance and the rank consistency under data-driven explicit choices of the tuning parameters and of the scaling parameter of the noise.
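To illustrate how an SVD yields an explicit low-rank predictor, the sketch below implements the classical one-sided special case (A∗ equal to the identity, i.e. reduced-rank multivariate regression): the rank-r least-squares predictor is the rank-r SVD truncation of the full least-squares fit, by the Eckart–Young theorem. This is a minimal toy sketch for intuition, not the paper's two-sided estimator; the function name, dimensions, and noise level are illustrative assumptions.

```python
import numpy as np

def low_rank_predictor(X, Y, r):
    """Rank-r reduced-rank regression predictor (one-sided toy version).

    Minimizes ||Y - X B||_F^2 over rank(B) <= r: the fitted values are
    the rank-r SVD truncation of the full least-squares fit (Eckart-Young).
    """
    # Full least-squares fitted values: projection of Y onto col(X)
    B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
    Y_hat = X @ B_ols
    # Keep only the top-r singular directions of the fitted values
    U, s, Vt = np.linalg.svd(Y_hat, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

# Toy data with a true rank-2 coefficient matrix (illustrative sizes)
rng = np.random.default_rng(0)
n, p, q, r = 50, 10, 8, 2
X = rng.standard_normal((n, p))
B_true = rng.standard_normal((p, r)) @ rng.standard_normal((r, q))  # rank 2
Y = X @ B_true + 0.1 * rng.standard_normal((n, q))
Y_r = low_rank_predictor(X, Y, r)
print(np.linalg.matrix_rank(Y_r))  # → 2
```

In the paper's two-sided setting the analogous truncation must account for both A∗ and B∗, but the core mechanism, replacing a full least-squares fit by its best rank-r approximation, is the same.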
| Original language | English |
|---|---|
| Pages (from-to) | 1174-1198 |
| Number of pages | 25 |
| Journal | Electronic Journal of Statistics |
| Volume | 19 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 1 Jan 2025 |
Keywords
- Matrix regression
- multivariate response regression
- nuclear norm penalized
- oracle inequality
- rank penalized
- rank selection
- two-sided matrix regression
Low-rank regularization in two-sided matrix regression