New Error Bounds for Deep ReLU Networks Using Sparse Grids

Research output: Contribution to journal › Article › peer-review

Abstract

We prove a theorem on the approximation of multivariate functions by deep ReLU networks. We present new error estimates in which the curse of dimensionality is lessened by establishing a connection with sparse grids.
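The connection between ReLU networks and sparse grids typically rests on the fact that the one-dimensional hat function underlying hierarchical sparse-grid bases is *exactly* expressible as a small combination of ReLU units. The following minimal Python sketch (an illustration of this standard identity, not code from the paper) demonstrates it:

```python
def relu(x):
    """Rectified linear unit."""
    return max(x, 0.0)

def hat(x):
    """Standard hat (hierarchical) basis function on [0, 1], peaking at x = 1/2.
    It is exactly a 3-term ReLU combination:
        hat(x) = 2*relu(x) - 4*relu(x - 1/2) + 2*relu(x - 1)
    """
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

# Sample on a small grid; these values are exact, not approximations.
print([hat(x) for x in (0.0, 0.25, 0.5, 0.75, 1.0)])  # [0.0, 0.5, 1.0, 0.5, 0.0]
```

Because the hat function is represented exactly by a single hidden ReLU layer, error bounds for sparse-grid interpolants can be transferred to ReLU networks, which is the kind of connection the abstract alludes to.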

Original language: English
Pages (from-to): 78-92
Number of pages: 15
Journal: SIAM Journal on Mathematics of Data Science
Volume: 1
Issue number: 1
DOIs
Publication status: Published - 1 Jan 2019
Externally published: Yes

Keywords

  • approximation theory
  • curse of dimensionality
  • deep networks
  • machine learning
  • neural networks
  • sparse grids
