TY - JOUR
T1 - Construction of Neural Networks for Realization of Localized Deep Learning
AU - Chui, Charles K.
AU - Lin, Shao-Bo
AU - Zhou, Ding-Xuan
N1 - The research of CC is partially supported by U.S. ARO Grant W911NF-15-1-0385, Hong Kong Research Council (Grant No. 12300917), and Hong Kong Baptist University (Grant No. HKBU-RC-ICRS/16-17/03). The research of S-BL is partially supported by the National Natural Science Foundation of China (Grant No. 61502342). The work of D-XZ is supported partially by the Research Grants Council of Hong Kong (Project No. CityU 11303915) and by the National Natural Science Foundation of China (Grant No. 11461161006).
PY - 2018/5/17
Y1 - 2018/5/17
AB - The subject of deep learning has recently attracted users of machine learning from various disciplines, including medical diagnosis and bioinformatics, financial market analysis and online advertisement, speech and handwriting recognition, computer vision and natural language processing, time series forecasting, and search engines. However, theoretical development of deep learning is still in its infancy. The objective of this paper is to introduce a deep neural network (also called deep-net) approach to localized manifold learning, with each hidden layer endowed with a specific learning task. For the purpose of illustration, we focus only on deep-nets with three hidden layers, with the first layer for dimensionality reduction, the second layer for bias reduction, and the third layer for variance reduction. A feedback component is also designed to deal with outliers. The main theoretical result in this paper is the order (Formula presented.) of approximation of the regression function with regularity s, in terms of the number m of sample points, where the (unknown) manifold dimension d replaces the dimension D of the sampling (Euclidean) space for shallow nets.
KW - deep learning
KW - deep nets
KW - feedback
KW - learning theory
KW - manifold learning
UR - http://www.scopus.com/inward/record.url?scp=85059325169&partnerID=8YFLogxK
U2 - 10.3389/fams.2018.00014
DO - 10.3389/fams.2018.00014
M3 - Journal article
AN - SCOPUS:85059325169
SN - 2297-4687
VL - 4
JO - Frontiers in Applied Mathematics and Statistics
JF - Frontiers in Applied Mathematics and Statistics
M1 - 14
ER -