Abstract
Popular sparse estimation methods based on ℓ1-relaxation, such as the Lasso and the Dantzig selector, require knowledge of the noise variance in order to properly tune the regularization parameter. This is a major obstacle to applying these methods in settings such as time series, random fields, and inverse problems, where the noise is rarely homoscedastic and its level is hard to know in advance. In this paper, we propose a new approach to the joint estimation of the conditional mean and the conditional variance in a high-dimensional (auto-)regression setting. An attractive feature of the proposed estimator is that it is efficiently computable even for very large-scale problems by solving a second-order cone program (SOCP). We present a theoretical analysis and numerical results assessing the performance of the proposed procedure.
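To illustrate the kind of model the abstract refers to, the following toy sketch simulates a heteroscedastic regression, where the noise level varies with the covariates, and jointly estimates the mean and log-variance coefficients by simple alternation (weighted least squares for the mean, a regression of log squared residuals for the variance). This is a hypothetical low-dimensional illustration of the data model only, not the paper's SOCP-based group-sparse estimator; all variable names and the alternating scheme are assumptions for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.standard_normal((n, p))
beta = np.array([2.0, -1.0, 0.0, 0.0, 0.5])    # sparse mean coefficients
gamma = np.array([0.5, 0.0, 0.0, 0.3, 0.0])    # log-variance coefficients
sigma = np.exp(0.5 * (X @ gamma))              # heteroscedastic noise level
y = X @ beta + sigma * rng.standard_normal(n)

# Alternate between:
#  (1) weighted least squares for the mean, given variance weights,
#  (2) regressing log squared residuals on X for the log-variance.
g = np.zeros(p)
for _ in range(20):
    w = np.exp(-(X @ g))                       # estimates of 1/sigma_i^2
    Xw = X * w[:, None]
    b = np.linalg.solve(X.T @ Xw, Xw.T @ y)    # WLS estimate of beta
    r2 = np.log((y - X @ b) ** 2 + 1e-12)      # log squared residuals
    # E[log(eps^2)] for standard Gaussian eps is about -1.27; correct the bias
    g = np.linalg.lstsq(X, r2 + 1.27, rcond=None)[0]
```

After convergence, `b` approximates the mean coefficients `beta` and `g` the log-variance coefficients `gamma`; in the high-dimensional, group-sparse regime addressed by the paper, this naive alternation is replaced by a single convex SOCP formulation.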
| Original language | English |
|---|---|
| Pages | 1416-1424 |
| Number of pages | 9 |
| Publication status | Published - 1 Jan 2013 |
| Externally published | Yes |
| Event | 30th International Conference on Machine Learning, ICML 2013, Atlanta, GA, United States |
| Duration | 16 Jun 2013 → 21 Jun 2013 |
Conference
| Conference | 30th International Conference on Machine Learning, ICML 2013 |
|---|---|
| Country/Territory | United States |
| City | Atlanta, GA |
| Period | 16/06/13 → 21/06/13 |