GAP safe screening rules for sparse-group lasso

Research output: Contribution to journal › Conference article › peer-review

Abstract

For statistical learning in high dimension, sparse regularizations have proven useful for boosting both computational and statistical efficiency. In some contexts, it is natural to handle more refined structures than pure sparsity, such as group sparsity. The Sparse-Group Lasso has recently been introduced in the context of linear regression to enforce sparsity both at the feature and at the group level. We propose the first (provably) safe screening rules for the Sparse-Group Lasso, i.e., rules that allow us to discard, early in the solver, features and groups that are inactive at the optimal solution. Thanks to efficient dual gap computations relying on the geometric properties of the ε-norm, safe screening rules for the Sparse-Group Lasso lead to significant gains in terms of computing time for our coordinate descent implementation.
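To illustrate the duality-gap mechanism that the abstract refers to, the sketch below shows a GAP safe screening test for the plain Lasso (1/2‖y − Xβ‖² + λ‖β‖₁), the simpler case that the paper generalizes to the Sparse-Group Lasso via the ε-norm. This is only a minimal illustration under those assumptions, not the paper's implementation; the function name gap_safe_screen and all variable names are hypothetical.

import numpy as np


def gap_safe_screen(X, y, beta, lam):
    """Boolean mask of features provably inactive at the Lasso optimum,
    obtained from a GAP safe sphere test (illustrative sketch only)."""
    residual = y - X @ beta
    # Dual feasible point built by rescaling the residual so that
    # ||X^T theta||_inf <= 1 holds.
    correlation = X.T @ residual
    scale = max(lam, np.max(np.abs(correlation)))
    theta = residual / scale

    # Primal and dual objectives, and the duality gap.
    primal = 0.5 * residual @ residual + lam * np.abs(beta).sum()
    dual = 0.5 * (y @ y) - 0.5 * lam ** 2 * np.sum((theta - y / lam) ** 2)
    gap = max(primal - dual, 0.0)

    # GAP safe sphere: the dual optimum lies within radius r of theta.
    radius = np.sqrt(2.0 * gap) / lam

    # Feature j is safely discarded if |x_j^T theta| + r * ||x_j||_2 < 1.
    col_norms = np.linalg.norm(X, axis=0)
    return np.abs(X.T @ theta) + radius * col_norms < 1.0


# Tiny usage example with random data and a crude (zero) coefficient estimate.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 200))
y = rng.standard_normal(50)
lam = 0.5 * np.max(np.abs(X.T @ y))
beta = np.zeros(200)
inactive = gap_safe_screen(X, y, beta, lam)
print(f"{inactive.sum()} of {inactive.size} features screened out")

In the Sparse-Group Lasso setting treated in the paper, the analogous test is applied both group-wise and feature-wise, and the dual feasibility check involves the ε-norm rather than the simple rescaling used above.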

Original language: English
Pages (from-to): 388-396
Number of pages: 9
Journal: Advances in Neural Information Processing Systems
Publication status: Published - 1 Jan 2016
Externally published: Yes
Event: 30th Annual Conference on Neural Information Processing Systems, NIPS 2016 - Barcelona, Spain
Duration: 5 Dec 2016 - 10 Dec 2016
