
Low-rank regularization in two-sided matrix regression

Research output: Contribution to journal › Article › Peer-reviewed

Abstract

The two-sided matrix regression model Y = AXB + E aims at predicting Y by accounting for linear links both among the column features of X, via the unknown matrix B, and among its row features, via the matrix A. We propose low-rank predictors in this high-dimensional matrix regression model via rank-penalized and nuclear-norm-penalized least squares. Neither criterion is jointly convex; nevertheless, we construct explicit predictors based on the SVD and establish optimal prediction bounds. We give sufficient conditions for consistent rank selection, and we also propose a fully data-driven rank-adaptive procedure. Simulation results confirm the good prediction performance and the rank consistency under data-driven choices of the tuning parameters and of the noise scaling parameter.
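The two penalties mentioned in the abstract correspond to two standard ways of shrinking the singular values of a least-squares fit: hard truncation (rank penalty) and soft-thresholding (nuclear-norm penalty). The following sketch illustrates both on a simplified one-sided model Y = XC + E; it is not the paper's two-sided estimator, and all names and parameter values are illustrative assumptions.

```python
import numpy as np

def rank_truncate(M, r):
    # Rank penalty: keep only the top-r singular values (hard thresholding).
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

def nuclear_shrink(M, lam):
    # Nuclear-norm penalty: soft-threshold all singular values by lam.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - lam, 0.0)) @ Vt

# Toy data: rank-2 coefficient matrix plus Gaussian noise (assumed setup).
rng = np.random.default_rng(0)
n, p, m, r = 50, 10, 8, 2
X = rng.standard_normal((n, p))
C = rng.standard_normal((p, r)) @ rng.standard_normal((r, m))  # true rank-2 coefficient
Y = X @ C + 0.1 * rng.standard_normal((n, m))

C_ls = np.linalg.pinv(X) @ Y        # unpenalized least-squares fit
C_rank = rank_truncate(C_ls, r)     # rank-penalized estimate
C_nuc = nuclear_shrink(C_ls, 0.05)  # nuclear-norm-penalized estimate
```

In the paper's two-sided setting the same singular-value operations are applied to build explicit predictors of AXB, with the tuning parameters (the rank r or the threshold lam above) chosen in a data-driven way.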

Original language: English
Pages (from-to): 1174-1198
Number of pages: 25
Journal: Electronic Journal of Statistics
Volume: 19
Issue number: 1
Status: Published - 1 Jan 2025
