Abstract
Cheung and Xu (2001) presented a dual structural recurrent radial basis function (RBF) network that accounts for the different scales of the net's inputs and outputs. However, such a network implies that the underlying functional relationship between the net's inputs and outputs is linearly separable, which may not hold in practice. In this paper, we therefore propose a new recurrent RBF network. It takes the net's input and the past outputs as an augmented input, in analogy with the one in Billings and Fung (1995), but introduces a scale tuner into the net's hidden layer to balance the different scales between inputs and outputs. This network adaptively learns the parameters in the hidden layer together with those in the output layer. We implement this network using a variant of the extended normalized RBF (Cheung and Xu (2001)), with its hidden units learned by the rival penalization controlled competitive learning algorithm. Experiments demonstrate the outstanding performance of the proposed network in recursive function estimation.
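The architecture described above can be illustrated with a minimal sketch: a normalized RBF network evaluated on an augmented input `[u_t, y_{t-1}]`, with a per-dimension scale vector standing in for the paper's scale tuner. All names, shapes, and parameter values here are illustrative assumptions, not the authors' implementation; hidden-unit learning (e.g. rival penalization controlled competitive learning) is omitted and fixed centers are used instead.

```python
import numpy as np

class RecurrentNRBF:
    """Sketch of a recurrent normalized RBF net on an augmented input
    [u_t, y_{t-1}] with a per-dimension scale tuner (all hypothetical)."""

    def __init__(self, centers, weights, scale, width=1.0):
        self.centers = np.asarray(centers, dtype=float)  # (k, d) hidden-unit centers
        self.weights = np.asarray(weights, dtype=float)  # (k,) output-layer weights
        self.scale = np.asarray(scale, dtype=float)      # (d,) scale-tuner vector
        self.width = width                               # shared RBF width

    def step(self, u, y_prev):
        # Rescale the augmented input so input and fed-back output
        # contribute on comparable scales.
        x = self.scale * np.array([u, y_prev])
        d2 = np.sum((self.centers - x) ** 2, axis=1)
        phi = np.exp(-d2 / (2.0 * self.width ** 2))      # Gaussian activations
        # Normalized RBF output: a convex combination of the weights.
        return float(self.weights @ phi / (phi.sum() + 1e-12))

    def run(self, inputs, y0=0.0):
        # Recursively feed each output back as part of the next input.
        y, outputs = y0, []
        for u in inputs:
            y = self.step(u, y)
            outputs.append(y)
        return outputs

net = RecurrentNRBF(centers=[[0.0, 0.0], [1.0, 1.0]],
                    weights=[0.2, 0.8], scale=[1.0, 0.5])
ys = net.run([0.1, 0.5, 0.9])
```

Because the output is a normalized (convex) combination of the output weights, each prediction stays within the range spanned by those weights, regardless of the scale tuner's values.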
Original language | English |
---|---|
Title of host publication | Proceedings of the 9th International Conference on Neural Information Processing |
Subtitle of host publication | Computational Intelligence for the E-Age |
Editors | Lipo Wang, Jagath C. Rajapakse, Kunihiko Fukushima, Soo Young Lee, Xin Yao |
Publisher | IEEE |
Pages | 1032-1036 |
Number of pages | 5 |
Volume | 2 |
ISBN (Electronic) | 9789810475246 |
ISBN (Print) | 9810475241 |
DOIs | |
Publication status | Published - Nov 2002 |
Externally published | Yes |
Event | 9th International Conference on Neural Information Processing, ICONIP 2002 - Singapore, Singapore Duration: 18 Nov 2002 → 22 Nov 2002 |
Conference
Conference | 9th International Conference on Neural Information Processing, ICONIP 2002 |
---|---|
Country/Territory | Singapore |
City | Singapore |
Period | 18/11/02 → 22/11/02 |
Scopus Subject Areas
- Computer Networks and Communications
- Information Systems
- Signal Processing