Construction of Neural Networks for Realization of Localized Deep Learning

Charles K. Chui, Shao Bo Lin*, Ding Xuan Zhou

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

28 Citations (Scopus)

Abstract

The subject of deep learning has recently attracted users of machine learning from various disciplines, including medical diagnosis and bioinformatics, financial market analysis and online advertisement, speech and handwriting recognition, computer vision and natural language processing, time series forecasting, and search engines. However, the theoretical development of deep learning is still in its infancy. The objective of this paper is to introduce a deep neural network (also called deep-net) approach to localized manifold learning, with each hidden layer endowed with a specific learning task. For the purpose of illustration, we focus only on deep-nets with three hidden layers: the first layer for dimensionality reduction, the second layer for bias reduction, and the third layer for variance reduction. A feedback component is also designed to deal with outliers. The main theoretical result of this paper is an approximation order of (Formula presented.) for the regression function with regularity s, in terms of the number m of sample points, in which the (unknown) manifold dimension d replaces the dimension D of the sampling (Euclidean) space required by shallow nets.
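
The following is a minimal, hypothetical sketch of the layer roles described above, not the paper's explicit construction: the layer widths, the ReLU activation, the random embedding used to generate synthetic data, and the ridge least-squares readout are all assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

D, d = 50, 3          # ambient dimension D, (assumed) manifold dimension d
m = 200               # number of sample points
n_units = 64          # width of the bias-reduction layer (assumption)

# Synthetic data: inputs lying on a d-dimensional linear slice of R^D,
# standing in for samples from a d-dimensional manifold.
Z = rng.normal(size=(m, d))
embed = rng.normal(size=(d, D)) / np.sqrt(d)
X = Z @ embed
y = np.sin(Z[:, 0]) + 0.1 * rng.normal(size=m)   # noisy regression targets

# Layer 1 -- dimensionality reduction: map R^D down to d internal coordinates.
W1 = rng.normal(size=(D, d)) / np.sqrt(D)
H1 = relu(X @ W1)

# Layer 2 -- bias reduction: a wider nonlinear layer to shrink approximation error.
W2 = rng.normal(size=(d, n_units)) / np.sqrt(d)
H2 = relu(H1 @ W2)

# Layer 3 -- variance reduction: output weights fitted by ridge least squares,
# averaging over the hidden units to damp estimation variance.
lam = 1e-2
w_out = np.linalg.solve(H2.T @ H2 + lam * np.eye(n_units), H2.T @ y)
y_hat = H2 @ w_out

print("training RMSE:", np.sqrt(np.mean((y_hat - y) ** 2)))
```

In this sketch the three stages only mimic the division of labor stated in the abstract (dimension, bias, variance); the paper itself constructs each layer explicitly and adds a feedback component for outliers, which is omitted here.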

Original language: English
Article number: 14
Journal: Frontiers in Applied Mathematics and Statistics
Volume: 4
DOIs
Publication status: Published - 17 May 2018

Scopus Subject Areas

  • Applied Mathematics
  • Statistics and Probability

User-Defined Keywords

  • deep learning
  • deep nets
  • feedback
  • learning theory
  • manifold learning
