Low-rank regularization in two-sided matrix regression

Abstract
The two-sided matrix regression model Y = A∗ X B∗ + E aims at predicting Y by accounting both for linear links among the column features of X, via the unknown matrix B∗, and among its row features, via the matrix A∗. We propose low-rank predictors in this high-dimensional matrix regression model via rank-penalized and nuclear-norm-penalized least squares. Neither criterion is jointly convex; nevertheless, we derive explicit predictors based on the SVD and establish optimal prediction bounds. We give sufficient conditions for consistent rank selection. We also propose a fully data-driven rank-adaptive procedure. Simulation results confirm the good prediction and rank-consistency properties under data-driven explicit choices of the tuning parameters and the scaling parameter of the noise.
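The SVD-based construction behind such rank-penalized predictors can be illustrated in the simpler one-sided special case Y ≈ X C (i.e. taking A∗ to be the identity), where the best rank-r predictor is the rank-r truncated SVD of the projection of Y onto the column space of X. The sketch below is illustrative only and not the paper's code; all variable names and the noise level are assumptions.

```python
import numpy as np

# Illustrative sketch (not the paper's procedure): rank-penalized least
# squares in the one-sided special case Y ~ X C of the two-sided model.
rng = np.random.default_rng(0)
n, p, q, r = 50, 8, 6, 2

X = rng.standard_normal((n, p))
# Rank-2 true coefficient matrix, plus small Gaussian noise on Y.
C_true = rng.standard_normal((p, r)) @ rng.standard_normal((r, q))
Y = X @ C_true + 0.1 * rng.standard_normal((n, q))

# Unconstrained least-squares fit = projection of Y onto col(X).
P = X @ np.linalg.pinv(X)
Y_ls = P @ Y

# Rank-r truncation of the fitted values via SVD gives the explicit
# low-rank predictor (reduced-rank regression).
U, s, Vt = np.linalg.svd(Y_ls, full_matrices=False)
Y_hat = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

rank_hat = np.linalg.matrix_rank(Y_hat)
err_full = np.linalg.norm(Y - Y_ls)   # residual of the full-rank fit
err_rank = np.linalg.norm(Y - Y_hat)  # residual of the rank-r predictor
```

By orthogonality of the residual Y − Y_ls to col(X), truncating the fitted values can only increase the in-sample residual, which is the trade-off the rank penalty controls.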
| Original language | English |
|---|---|
| Pages (from-to) | 1174-1198 |
| Number of pages | 25 |
| Journal | Electronic Journal of Statistics |
| Volume | 19 |
| Issue number | 1 |
| DOIs | |
| Status | Published - 1 Jan. 2025 |