Estimating the Minimizer and the Minimum Value of a Regression Function under Passive Design

  • ENSAE
  • Eurecom

Research output: Contribution to journal › Article › peer-review

Abstract

We propose a new method for estimating the minimizer x* and the minimum value f* of a smooth and strongly convex regression function f from observations contaminated by random noise. Our estimator z_n of the minimizer x* is based on a version of projected gradient descent with the gradient estimated by a regularized local polynomial algorithm. Next, we propose a two-stage procedure for estimating the minimum value f* of the regression function f. At the first stage, we construct a sufficiently accurate estimator of x*, which can be, for example, z_n. At the second stage, we estimate the function value at the point obtained at the first stage using a rate-optimal nonparametric procedure. We derive non-asymptotic upper bounds for the quadratic risk and the optimization risk of z_n, and for the risk of estimating f*. We establish minimax lower bounds showing that, under a certain choice of parameters, the proposed algorithms achieve the minimax optimal rates of convergence on the class of smooth and strongly convex functions.
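As a rough illustration of the first-stage idea, the following sketch runs projected gradient descent with a noisy gradient oracle. It is not the paper's method: the constraint set (a Euclidean ball), the step size, and the noisy-gradient stand-in for the regularized local polynomial estimator are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def project(z, radius=10.0):
    """Project z onto a Euclidean ball (hypothetical constraint set)."""
    norm = np.linalg.norm(z)
    return z if norm <= radius else z * (radius / norm)

def noisy_grad(f_grad, x, sigma=0.1):
    """Stand-in for a nonparametric gradient estimate:
    the true gradient corrupted by Gaussian noise."""
    return f_grad(x) + sigma * rng.standard_normal(x.shape)

def projected_gradient_descent(f_grad, x0, steps=500, eta=0.1):
    """Iterate z <- Proj(z - eta * g(z)) with an inexact gradient g."""
    z = np.asarray(x0, dtype=float)
    for _ in range(steps):
        z = project(z - eta * noisy_grad(f_grad, z))
    return z

# Example: f(x) = ||x - c||^2 is strongly convex with minimizer c.
c = np.array([1.0, -2.0])
z_n = projected_gradient_descent(lambda x: 2.0 * (x - c), np.zeros(2))
```

Here z_n approaches c up to a fluctuation driven by the gradient noise; the second stage of the paper then estimates f at such a point with a separate nonparametric procedure.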

Original language: English
Journal: Journal of Machine Learning Research
Volume: 25
Publication status: Published - 1 Jan 2024
Externally published: Yes

Keywords

  • Local polynomial estimator
  • Minimax optimality
  • Nonparametric regression
  • Passive design
  • Stochastic optimization
