Deep Net Tree Structure for Balance of Capacity and Approximation Ability

Charles Kam-Tai CHUI, Shao Bo Lin*, Ding Xuan Zhou

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Deep learning has been applied successfully in a variety of areas, including image classification, natural language processing, and game theory. At the heart of deep learning is the use of deep neural networks (deep nets for short) with particular structures to build the estimator. Depth and structure are two crucial factors in the development of deep nets. In this paper, we propose a novel tree structure that equips deep nets to compensate for the capacity drawback of deep fully connected neural networks (DFCN) and to enhance the approximation ability of deep convolutional neural networks (DCNN). Based on an empirical risk minimization algorithm, we derive fast learning rates for deep nets equipped with this structure.
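To make the idea of tree-structured connectivity concrete, the following is a minimal sketch, not the authors' construction: a binary-tree net over an input of length 2^k, where each hidden unit combines exactly two units from the layer below, in contrast to a fully connected layer where every unit sees every unit below. The weights here are random placeholders; in the paper's setting they would be chosen by empirical risk minimization. The function names (`tree_net`, `count_tree_params`, `count_fc_params`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def tree_net(x):
    """Forward pass through a binary-tree-structured net on an input of length 2^k.

    Each level halves the number of units: adjacent units are paired, and
    one small ReLU unit (two weights plus a bias) maps each pair to a parent.
    """
    h = np.asarray(x, dtype=float)
    while h.size > 1:
        pairs = h.reshape(-1, 2)               # group adjacent units in pairs
        w = rng.standard_normal(pairs.shape)   # one 2-weight unit per pair
        b = rng.standard_normal(pairs.shape[0])
        h = relu((pairs * w).sum(axis=1) + b)  # each parent sees only 2 children
    return h[0]

def count_tree_params(d):
    """Parameters of the tree net: a binary tree over d leaves has d - 1
    internal nodes, each with 2 weights and 1 bias."""
    return 3 * (d - 1)

def count_fc_params(d):
    """Parameters of a fully connected net of the same depth (log2 d) with
    constant width d: each layer has a d-by-d weight matrix plus d biases."""
    depth = int(np.log2(d))
    return depth * (d * d + d)

out = tree_net(np.ones(8))
print(count_tree_params(8), count_fc_params(8))
```

The parameter counts illustrate the capacity gap the abstract alludes to: the tree net's parameter count grows linearly in the input dimension, while the same-depth fully connected net grows quadratically, which is one reason sparse structures can yield better capacity (covering-number) bounds.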

Original language: English
Article number: 46
Journal: Frontiers in Applied Mathematics and Statistics
Volume: 5
Publication status: Published - 11 Sep 2019

Scopus Subject Areas

  • Applied Mathematics
  • Statistics and Probability

User-Defined Keywords

  • deep learning
  • deep nets
  • empirical risk minimization
  • learning theory
  • tree structure
