Structured regularization for conditional Gaussian graphical models

Abstract
Conditional Gaussian graphical models are a reparametrization of the multivariate linear regression model that explicitly exhibits (i) the partial covariances between the predictors and the responses, and (ii) the partial covariances among the responses themselves. Such models are particularly suited to interpretation, since partial covariances describe direct relationships between variables. In this framework, we propose a regularization scheme that enhances the learning strategy by using prior structural information to drive the selection of the relevant input features. The scheme comes with an efficient alternating optimization procedure that is guaranteed to converge to the global minimum. Beyond showing competitive performance on artificial and real datasets, our method demonstrates capabilities for fine interpretation, as illustrated on three high-dimensional datasets from spectroscopy, genetics, and genomics.
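The reparametrization mentioned in the abstract rests on a standard Gaussian identity: for a jointly Gaussian pair (X, Y) with joint precision matrix Ω partitioned into blocks, the regression coefficients satisfy B = −Ω_xy Ω_yy⁻¹ and the residual covariance is Ω_yy⁻¹, so the off-diagonal block Ω_xy carries the predictor–response partial covariances and Ω_yy those among the responses. A minimal numerical sketch of this identity (numpy; all variable names illustrative, not taken from the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)
p, q = 4, 3  # p predictors X, q responses Y

# Build a random SPD joint covariance of (X, Y) and its precision matrix.
A = rng.standard_normal((p + q, p + q))
Sigma = A @ A.T + (p + q) * np.eye(p + q)
Omega = np.linalg.inv(Sigma)

# Block partition: Sigma/Omega over (X, Y).
Sxx, Sxy = Sigma[:p, :p], Sigma[:p, p:]
Syy = Sigma[p:, p:]
Oxy, Oyy = Omega[:p, p:], Omega[p:, p:]

# Regression coefficients of Y on X, computed two ways:
# from the covariance (B = Sxx^{-1} Sxy) and from the precision
# (B = -Oxy Oyy^{-1}); the cGGM identity says they coincide.
B_cov = np.linalg.solve(Sxx, Sxy)
B_prec = -Oxy @ np.linalg.inv(Oyy)
assert np.allclose(B_cov, B_prec)

# Residual covariance of Y given X equals the inverse of Oyy.
R_cov = Syy - Sxy.T @ np.linalg.solve(Sxx, Sxy)
assert np.allclose(R_cov, np.linalg.inv(Oyy))
```

Regularizing Ω_xy directly (rather than B) is what makes sparsity patterns in the fit interpretable as direct predictor–response relationships.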
| Original language | English |
|---|---|
| Pages (from-to) | 789-804 |
| Number of pages | 16 |
| Journal | Statistics and Computing |
| Volume | 27 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - 1 May 2017 |
| Externally published | Yes |
Keywords
- Conditional Gaussian graphical model
- Multivariate regression
- QTL study
- Regularization
- Regulatory motif
- Sparsity
- Spectroscopy
- Structured elastic net