Approximation of smooth functionals using deep ReLU networks

Linhao Song*, Jun Fan, Ding-Xuan Zhou

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

11 Citations (Scopus)

Abstract

In recent years, deep neural networks have been employed to approximate nonlinear continuous functionals F defined on L^p([−1,1]^s) for 1 ≤ p ≤ ∞. However, the existing theoretical analysis in the literature is either unsatisfactory due to poor approximation rates, or does not apply to the rectified linear unit (ReLU) activation function. This paper investigates the approximation power of functional deep ReLU networks in two settings: F is continuous with restrictions on its modulus of continuity, and F has higher order Fréchet derivatives. A novel functional network structure is proposed to extract the higher order smoothness harbored by the target functional F. Quantitative rates of approximation in terms of the depth, width and total number of weights of the neural networks are derived for both settings. We give logarithmic rates when the approximation error is measured on the unit ball of a Hölder space. In addition, we establish nearly polynomial rates, i.e., rates of the form exp(−a(log M)^b) with a > 0 and 0 < b < 1, when the approximation error is measured on a space of analytic functions.
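As a rough illustration of the two regimes described in the abstract, the display below sketches the shape of the error bounds. Here M denotes the total number of weights of the functional ReLU network and F_M its output functional (notation introduced here for illustration); the exponent θ > 0 in the logarithmic rate is a placeholder, since the abstract does not state it, while the nearly polynomial rate is quoted in the form given above.

```latex
% Illustrative forms of the approximation rates mentioned in the abstract.
% M = total number of weights; F_M = functional realized by the network
% (both symbols are assumed notation, not taken verbatim from the paper).

% Logarithmic rate on the unit ball of a Hölder space
% (theta > 0 is a placeholder exponent):
\sup_{f} \bigl| F(f) - F_M(f) \bigr| \;\le\; C \,(\log M)^{-\theta}.

% Nearly polynomial rate on a space of analytic functions,
% in the form stated in the abstract:
\sup_{f} \bigl| F(f) - F_M(f) \bigr| \;\le\; C \exp\!\bigl(-a(\log M)^{b}\bigr),
\qquad a > 0,\; 0 < b < 1.
```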

Original language: English
Pages (from-to): 424-436
Number of pages: 13
Journal: Neural Networks
Volume: 166
Early online date: 18 Jul 2023
DOIs
Publication status: Published - Sept 2023

Scopus Subject Areas

  • Cognitive Neuroscience
  • Artificial Intelligence

User-Defined Keywords

  • Approximation theory
  • Deep learning theory
  • Fréchet derivative
  • Polynomial rates
  • ReLU
  • Smooth functionals
