Approximation of Nonlinear Functionals Using Deep ReLU Networks

Linhao Song, Jun Fan*, Di-Rong Chen, Ding-Xuan Zhou

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

6 Citations (Scopus)

Abstract

In recent years, functional neural networks have been proposed and studied in order to approximate nonlinear continuous functionals defined on L^p([-1, 1]^s) for integers s ≥ 1 and 1 ≤ p < ∞. However, beyond the universality of approximation, their theoretical properties are largely unknown, and existing analyses do not apply to the rectified linear unit (ReLU) activation function. To fill this void, we investigate the approximation power of functional deep neural networks with the ReLU activation function by constructing a continuous piecewise linear interpolation under a simple triangulation. In addition, we establish rates of approximation for the proposed functional deep ReLU networks under mild regularity conditions. Finally, our study may also shed some light on the understanding of functional data learning algorithms.
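The basic idea of a functional network of the kind the abstract describes can be sketched in a few lines: discretize the input function on a grid over [-1, 1] (here s = 1), then feed the sample vector through a ReLU network that outputs a scalar. This is a hypothetical illustration with arbitrary random weights and grid size, not the paper's construction or rates.

```python
import numpy as np

def functional_relu_net(f, m=16, width=32, seed=0):
    """Evaluate a toy functional network F(f) in two steps:
    (1) discretize the input function f on a uniform grid over [-1, 1];
    (2) pass the sample vector through a one-hidden-layer ReLU network.
    Weights are random and fixed by `seed`; grid size `m` and `width`
    are illustrative choices, not values from the paper."""
    rng = np.random.default_rng(seed)
    grid = np.linspace(-1.0, 1.0, m)          # uniform grid on [-1, 1]
    x = np.array([f(t) for t in grid])        # discretization: function -> vector
    W1 = rng.standard_normal((width, m)) / np.sqrt(m)
    W2 = rng.standard_normal(width) / np.sqrt(width)
    h = np.maximum(W1 @ x, 0.0)               # ReLU hidden layer
    return float(W2 @ h)                      # scalar output: the functional's value

# Example: evaluate the toy functional at f = cos
value = functional_relu_net(np.cos)
```

The paper's analysis concerns how well such architectures, with depth and width chosen appropriately, can approximate a given nonlinear continuous functional; the sketch above only shows the data flow from function to scalar.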

Original language: English
Article number: 50
Journal: Journal of Fourier Analysis and Applications
Volume: 29
Issue number: 4
Early online date: 28 Jul 2023
DOIs
Publication status: Published - Aug 2023

Scopus Subject Areas

  • Analysis
  • Mathematics (all)
  • Applied Mathematics

User-Defined Keywords

  • Approximation theory
  • Deep learning theory
  • Functional neural networks
  • ReLU
  • Modulus of continuity
