Abstract
We consider the problem of combining a (possibly uncountably infinite) set of affine estimators in the non-parametric regression model with heteroscedastic Gaussian noise. Focusing on the exponentially weighted aggregate, we prove a PAC-Bayesian-type inequality that leads to sharp oracle inequalities in discrete as well as continuous settings. The framework is general enough to cover combinations of various procedures such as least squares regression, kernel ridge regression, shrinkage estimators, and many other estimators used in the literature on statistical inverse problems. As a consequence, we show that the proposed aggregate provides an adaptive estimator in the exact minimax sense, without discretizing the range of tuning parameters or splitting the set of observations. We also illustrate numerically the good performance achieved by the exponentially weighted aggregate.
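The exponentially weighted aggregate described in the abstract assigns each candidate estimator a weight proportional to `exp(-risk/beta)`, where the risk is typically an unbiased estimate (e.g., via Stein's lemma) and `beta` is a temperature parameter. A minimal sketch of this weighting scheme, with all function and variable names being illustrative assumptions rather than the authors' code:

```python
import numpy as np

def exponentially_weighted_aggregate(predictions, risk_estimates, beta):
    """Combine candidate estimators with exponential weights.

    predictions    : array of shape (m, n), m candidate estimates of the signal
    risk_estimates : array of shape (m,), an (unbiased) risk estimate per candidate
    beta           : temperature parameter > 0 controlling weight concentration
    """
    r = np.asarray(risk_estimates, dtype=float)
    # Subtract the minimum risk before exponentiating for numerical stability;
    # this leaves the normalized weights unchanged.
    w = np.exp(-(r - r.min()) / beta)
    w /= w.sum()
    # The aggregate is the convex combination of the candidate predictions.
    return w @ np.asarray(predictions, dtype=float)
```

With a small `beta`, the weights concentrate on the candidate with the smallest estimated risk; with a large `beta`, the aggregate approaches a uniform average of the candidates.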
| Original language | English |
|---|---|
| Pages (from-to) | 635-660 |
| Number of pages | 26 |
| Journal | Journal of Machine Learning Research |
| Volume | 19 |
| Publication status | Published - 1 Jan 2011 |
| Event | 24th International Conference on Learning Theory, COLT 2011, co-located with the Foundations of Computational Mathematics Conference, FOCM 2011 - Budapest, Hungary. Duration: 9 Jul 2011 → 11 Jul 2011 |