Abstract
We study benign overfitting in the setting of nonparametric regression under mean squared risk, on the scale of Hölder classes. We construct a local polynomial estimator of the regression function that is minimax optimal on a Hölder class of any given smoothness, and that is a continuous function interpolating the set of observations with high probability. The key element of the construction is the use of singular kernels. Moreover, we prove that adaptation to unknown smoothness is compatible with benign overfitting. Namely, we construct a continuous interpolating local polynomial estimator attaining the minimax optimal rate in L2 adaptively to the unknown Hölder smoothness. Our results highlight the fact that interpolation can be fundamentally decoupled from the bias-variance tradeoff in the problem of nonparametric regression.
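The mechanism behind the interpolation property can be illustrated with the degree-zero case of a local polynomial estimator (the Nadaraya-Watson estimator) equipped with a singular kernel: because the kernel weight diverges as its argument approaches zero, the estimate at any design point is dominated by that point's own response, so the fitted curve passes through the data. The sketch below is illustrative only and not the paper's construction; the function names, the bandwidth `h`, and the exponent `a` are assumptions chosen for a runnable toy example.

```python
import numpy as np

def singular_kernel(u, a=0.49):
    # Illustrative singular kernel K(u) = |u|^(-a) on |u| <= 1.
    # The exponent a in (0, 1/2) keeps K integrable while K(u) -> infinity
    # as u -> 0, which is what produces interpolation.
    out = np.zeros_like(u, dtype=float)
    mask = (np.abs(u) <= 1.0) & (u != 0.0)
    out[mask] = np.abs(u[mask]) ** (-a)
    out[u == 0.0] = np.inf
    return out

def nw_singular(x_query, x, y, h=0.3, a=0.49):
    # Nadaraya-Watson (local polynomial of degree 0) estimate with a
    # singular kernel. At a design point the diverging weight dominates,
    # so the estimator returns the observed response there (interpolation).
    preds = np.empty(len(x_query), dtype=float)
    for j, xq in enumerate(x_query):
        u = (xq - x) / h
        exact = np.isclose(u, 0.0)
        if exact.any():
            preds[j] = y[exact][0]  # singular weight dominates
            continue
        w = singular_kernel(u, a)
        preds[j] = np.dot(w, y) / w.sum() if w.sum() > 0 else np.nan
    return preds
```

At the design points the estimator reproduces the observations exactly, while between them it averages nearby responses with weights decaying away from the query point; the actual paper uses higher-degree local polynomials to reach minimax rates on Hölder classes of arbitrary smoothness.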
| Original language | English |
|---|---|
| Pages (from-to) | 949-980 |
| Number of pages | 32 |
| Journal | Probability Theory and Related Fields |
| Volume | 189 |
| Issue number | 3-4 |
| DOIs | |
| Publication status | Published - 1 Aug 2024 |
Keywords
- 62G08
- Adaptive estimator
- Aggregation
- Benign overfitting
- Interpolation
- Local polynomial estimators
- Nonparametric regression
- Singular kernel