Abstract
We propose a new method for estimating the minimizer x∗ and the minimum value f∗ of a smooth and strongly convex regression function f from observations contaminated by random noise. Our estimator zn of the minimizer x∗ is based on a version of projected gradient descent with the gradient estimated by a regularized local polynomial algorithm. Next, we propose a two-stage procedure for estimating the minimum value f∗ of the regression function f. At the first stage, we construct a sufficiently accurate estimator of x∗, which can be, for example, zn. At the second stage, we estimate the function value at the point obtained in the first stage using a rate-optimal nonparametric procedure. We derive non-asymptotic upper bounds for the quadratic risk and optimization risk of zn, and for the risk of estimating f∗. We establish minimax lower bounds showing that, under a certain choice of parameters, the proposed algorithms achieve the minimax optimal rates of convergence on the class of smooth and strongly convex functions.
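The projected gradient iteration at the core of the estimator zn can be sketched on a toy strongly convex problem. This is a minimal illustration, not the paper's procedure: the regularized local polynomial gradient estimator is replaced by a simple noisy gradient oracle, and the projection set, step size, and noise level are all illustrative assumptions.

```python
import numpy as np

def project_ball(x, radius=1.0):
    """Euclidean projection onto the ball of the given radius centered at 0."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def projected_gd(grad_est, x0, steps=200, lr=0.1, radius=1.0):
    """Projected gradient descent driven by a (possibly noisy) gradient estimate."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = project_ball(x - lr * grad_est(x), radius)
    return x

# Toy example: f(x) = ||x - c||^2 / 2 is smooth and strongly convex with minimizer c.
rng = np.random.default_rng(0)
c = np.array([0.3, -0.2])
# Stand-in for the paper's local polynomial gradient estimator: exact gradient plus noise.
noisy_grad = lambda x: (x - c) + 0.01 * rng.standard_normal(2)

z = projected_gd(noisy_grad, x0=np.zeros(2))
```

With a step size below the inverse smoothness constant, the iterates contract toward the minimizer up to a noise-driven floor, mirroring the role of the gradient-estimation error in the paper's risk bounds.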
| Original language | English |
|---|---|
| Journal | Journal of Machine Learning Research |
| Volume | 25 |
| Publication status | Published - 1 Jan 2024 |
| Externally published | Yes |
Keywords
- Local polynomial estimator
- Minimax optimality
- Nonparametric regression
- Passive design
- Stochastic optimization
Title: Estimating the Minimizer and the Minimum Value of a Regression Function under Passive Design