Structured regularization for conditional Gaussian graphical models

Research output: Contribution to journal › Article › peer-review

Abstract

Conditional Gaussian graphical models are a reparametrization of the multivariate linear regression model that explicitly exhibits (i) the partial covariances between the predictors and the responses, and (ii) the partial covariances among the responses themselves. Such models are particularly suitable for interpretation, since partial covariances describe direct relationships between variables. In this framework, we propose a regularization scheme that enhances the learning of the model by driving the selection of the relevant input features with prior structural information. It comes with an efficient alternating optimization procedure that is guaranteed to converge to the global minimum. On top of showing competitive performance on artificial and real datasets, our method demonstrates capabilities for fine interpretation, as illustrated on three high-dimensional datasets from spectroscopy, genetics, and genomics.
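To give a flavor of the kind of structured regularization the abstract describes, the sketch below solves a multivariate regression with a structured elastic net penalty by proximal gradient descent: an l1 term for sparsity plus a quadratic term tr(B' L B), where L is a graph Laplacian encoding prior structure among the predictors. This is an illustrative simplification, not the paper's exact formulation; the function name, penalty weights, and solver choice are assumptions.

```python
import numpy as np

def structured_enet(X, Y, L, lam1=0.1, lam2=0.1, n_iter=500, tol=1e-8):
    """Proximal-gradient sketch (not the paper's algorithm) for
        (1/2n)||Y - X B||_F^2 + lam1*||B||_1 + (lam2/2)*tr(B' L B),
    where L is a symmetric graph Laplacian encoding prior structure
    over the rows of B (i.e., over the predictors)."""
    n, p = X.shape
    q = Y.shape[1]
    B = np.zeros((p, q))
    # Step size: inverse Lipschitz constant of the smooth part's gradient.
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 / n + lam2 * np.linalg.norm(L, 2) + 1e-12)
    for _ in range(n_iter):
        # Gradient of the smooth part: squared loss + Laplacian penalty.
        grad = X.T @ (X @ B - Y) / n + lam2 * (L @ B)
        Z = B - step * grad
        # Soft-thresholding = proximal operator of the l1 penalty.
        B_new = np.sign(Z) * np.maximum(np.abs(Z) - step * lam1, 0.0)
        if np.max(np.abs(B_new - B)) < tol:
            B = B_new
            break
        B = B_new
    return B
```

Because both penalty terms are convex and the loss is quadratic, the overall objective is convex, which is what makes global-convergence guarantees of the kind claimed in the abstract attainable; the Laplacian term shrinks coefficients of neighboring predictors toward each other, encoding the prior structure.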

Original language: English
Pages (from-to): 789-804
Number of pages: 16
Journal: Statistics and Computing
Volume: 27
Issue number: 3
DOIs
Publication status: Published - 1 May 2017
Externally published: Yes

Keywords

  • Conditional Gaussian graphical model
  • Multivariate regression
  • QTL study
  • Regularization
  • Regulatory motif
  • Sparsity
  • Spectroscopy
  • Structured elastic net
