Abstract
We prove a theorem on the approximation of multivariate functions by deep ReLU networks for which the curse of dimensionality is lessened. Our theorem rests on a constructive proof of the Kolmogorov–Arnold superposition theorem and applies to a subset of multivariate continuous functions whose outer superposition functions can be efficiently approximated by deep ReLU networks.
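The superposition structure mentioned in the abstract can be illustrated with a toy example. This is a minimal sketch, not the paper's construction: `relu_fit` (a hypothetical helper) builds the standard one-hidden-layer ReLU network that linearly interpolates a univariate function at equispaced breakpoints, and the identity xy = ((x+y)² − (x−y)²)/4 stands in for a Kolmogorov–Arnold-style superposition, with the squaring map playing the role of an outer function approximated by a ReLU network.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def relu_fit(g, a, b, k=64):
    """One-hidden-layer ReLU net interpolating g at k+1 equispaced points on [a, b].

    Uses the classic piecewise-linear form
        p(x) = g(a) + sum_j c_j * relu(x - t_j),
    where c_0 is the first slope and later c_j are slope changes at the knots.
    """
    xs = np.linspace(a, b, k + 1)
    ys = g(xs)
    slopes = np.diff(ys) / np.diff(xs)
    coefs = np.concatenate(([slopes[0]], np.diff(slopes)))  # hidden-unit weights
    knots = xs[:-1]                                         # hidden-unit biases

    def net(x):
        x = np.asarray(x, dtype=float)
        return ys[0] + relu(x[..., None] - knots) @ coefs

    return net

# Outer function Phi(t) = t^2, approximated by a ReLU net on [-2, 2].
phi_hat = relu_fit(np.square, -2.0, 2.0, k=64)

# Superposition: xy = (Phi(x + y) - Phi(x - y)) / 4, with Phi replaced by phi_hat.
def product_hat(x, y):
    return (phi_hat(x + y) - phi_hat(x - y)) / 4.0
```

With 64 interior breakpoints the interpolation error of the outer network on [-2, 2] is below 10⁻³, so `product_hat` reproduces xy on [-1, 1]² to the same order; the point of the sketch is that only the univariate outer function needs a network, which is the structural reason the superposition form can mitigate dimensionality.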
| Original language | English |
|---|---|
| Pages (from-to) | 1-6 |
| Number of pages | 6 |
| Journal | Neural Networks |
| Volume | 129 |
| DOIs | |
| Publication status | Published - 1 Sept 2020 |
| Externally published | Yes |
Keywords
- Approximation theory
- Curse of dimensionality
- Deep ReLU networks
- Kolmogorov–Arnold superposition theorem