Abstract
In this paper, a new modified hybrid learning algorithm for feedforward neural networks is proposed to achieve better generalization performance. To penalize both the sensitivity of the input-to-output mapping and the high-frequency components in the training data, two additional cost terms are introduced: the first is based on the first-order derivatives of the neural activations at the hidden layers, and the second on the second-order derivatives of the neural activations at the output layer. Finally, theoretical justifications and simulation results are given to verify the efficiency and effectiveness of the proposed learning algorithm.
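The sketch below illustrates the general shape of such a regularized cost for a single-hidden-layer network with sigmoid activations and a sum-of-squares error. The penalty weights `lam1`/`lam2`, the squared-derivative form of the two penalties, and all variable names are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_d1(z):
    # first-order derivative of the sigmoid activation
    s = sigmoid(z)
    return s * (1.0 - s)

def sigmoid_d2(z):
    # second-order derivative of the sigmoid activation
    s = sigmoid(z)
    return s * (1.0 - s) * (1.0 - 2.0 * s)

def hybrid_cost(X, T, W1, b1, W2, b2, lam1=1e-3, lam2=1e-3):
    """Standard error plus two additional cost terms (assumed forms):
    - lam1 * squared first-order activation derivatives at the hidden layer,
      penalizing input-to-output mapping sensitivity;
    - lam2 * squared second-order activation derivatives at the output layer,
      penalizing high-frequency components fitted from the training data."""
    A1 = X @ W1 + b1                      # hidden-layer pre-activations
    H = sigmoid(A1)
    A2 = H @ W2 + b2                      # output-layer pre-activations
    Y = sigmoid(A2)

    error = 0.5 * np.sum((Y - T) ** 2)    # sum-of-squares training error
    sensitivity_penalty = np.sum(sigmoid_d1(A1) ** 2)
    smoothness_penalty = np.sum(sigmoid_d2(A2) ** 2)
    return error + lam1 * sensitivity_penalty + lam2 * smoothness_penalty

# Example usage with random data (shapes only, for illustration)
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 4))              # 20 samples, 4 inputs
T = rng.normal(size=(20, 1))              # 1 target output
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
print(hybrid_cost(X, T, W1, b1, W2, b2))
```

Gradient-based training would then differentiate this combined cost with respect to the weights; the two penalty terms bias the solution toward a flatter, smoother mapping rather than one that fits high-frequency noise in the training set.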
| Original language | English |
| --- | --- |
| Pages (from-to) | 572-577 |
| Number of pages | 6 |
| Journal | Lecture Notes in Computer Science |
| Volume | 3496 |
| Issue number | I |
| DOIs | |
| Publication status | Published - 2005 |
| Event | Second International Symposium on Neural Networks: Advances in Neural Networks - ISNN 2005 - Chongqing, China (Duration: 30 May 2005 → 1 Jun 2005) |
Scopus Subject Areas
- Theoretical Computer Science
- Computer Science (all)