Sharp oracle bounds for monotone and convex regression through aggregation

Research output: Contribution to journal › Article › peer-review

Abstract

We derive oracle inequalities for the problems of isotonic and convex regression using a combination of the Q-aggregation procedure and sparsity pattern aggregation. This improves upon previous results, including the oracle inequalities for the constrained least squares estimator. One improvement is that our oracle inequalities are sharp, i.e., they have leading constant 1. This allows us to obtain bounds on the minimax regret, thus accounting for model misspecification, which was not possible with previous results. Another improvement is that we obtain oracle inequalities both with high probability and in expectation.
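To fix ideas, a sharp oracle inequality of the kind described above typically takes the following generic form; the estimator $\hat f$, model class $\mathcal{F}$, empirical norm $\|\cdot\|_n$, and remainder term $\Delta_n$ are illustrative placeholders, not the paper's exact statement:

```latex
% Generic form of a sharp oracle inequality: the constant in
% front of the oracle term is exactly 1, so the bound controls
% the excess risk (regret) even under misspecification, i.e.,
% when the true regression function f does not belong to F.
\[
  \|\hat f - f\|_n^2
  \;\le\;
  \min_{g \in \mathcal{F}} \|g - f\|_n^2
  \;+\; \Delta_n .
\]
% A non-sharp inequality would instead carry a constant C > 1
% multiplying the minimum; such a bound becomes uninformative
% for the regret once the approximation error dominates.
```

Subtracting the oracle term on both sides shows why the leading constant 1 is what makes regret bounds under model misspecification possible.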

Original language: English
Pages (from-to): 1879-1892
Number of pages: 14
Journal: Journal of Machine Learning Research
Volume: 16
Publication status: Published - 1 Sept 2015

Keywords

  • Aggregation
  • Convex regression
  • Isotonic regression
  • Minimax regret
  • Model misspecification
  • Shape constraints
  • Sharp oracle inequalities
