Simultaneous neural network approximation for smooth functions

Sean Hon*, Haizhao Yang

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review


Abstract

We establish in this work approximation results of deep neural networks for smooth functions measured in Sobolev norms, motivated by the recent development of numerical solvers for partial differential equations using deep neural networks. Our approximation results are nonasymptotic in the sense that the error bounds are explicitly characterized in terms of both the width and depth of the networks simultaneously, with all involved constants explicitly determined. Namely, for $f \in C^s([0,1]^d)$, we show that deep ReLU networks of width $\mathcal{O}(N \log N)$ and of depth $\mathcal{O}(L \log L)$ can achieve a nonasymptotic approximation rate of $\mathcal{O}(N^{-2(s-1)/d} L^{-2(s-1)/d})$ with respect to the $W^{1,p}([0,1]^d)$ norm for $p \in [1,\infty)$. If either the ReLU function or its square is applied as the activation function to construct deep neural networks of width $\mathcal{O}(N \log N)$ and of depth $\mathcal{O}(L \log L)$ to approximate $f \in C^s([0,1]^d)$, the approximation rate is $\mathcal{O}(N^{-2(s-n)/d} L^{-2(s-n)/d})$ with respect to the $W^{n,p}([0,1]^d)$ norm for $p \in [1,\infty)$.
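For readability, the two rates stated above can also be written in display form. This is a restatement in the abstract's own notation rather than the paper's verbatim theorem statements; the symbol $\phi$ for the approximating network is an assumption introduced here.

\[
\|f - \phi\|_{W^{1,p}([0,1]^d)} = \mathcal{O}\!\left(N^{-2(s-1)/d} L^{-2(s-1)/d}\right)
\quad \text{(ReLU activation, width } \mathcal{O}(N \log N)\text{, depth } \mathcal{O}(L \log L)\text{)},
\]
\[
\|f - \phi\|_{W^{n,p}([0,1]^d)} = \mathcal{O}\!\left(N^{-2(s-n)/d} L^{-2(s-n)/d}\right)
\quad \text{(ReLU or ReLU}^2\text{ activations, same width and depth)}.
\]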

Original language: English
Pages (from-to): 152-164
Number of pages: 13
Journal: Neural Networks
Volume: 154
Early online date: 9 Jul 2022
Publication status: Published - Oct 2022

Scopus Subject Areas

  • Cognitive Neuroscience
  • Artificial Intelligence

User-Defined Keywords

  • Approximation theory
  • Deep neural networks
  • ReLU activation functions
  • Sobolev norm
