A class of learning algorithms for principal component analysis and minor component analysis

Qingfu Zhang*, Yiu Wing Leung

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

27 Citations (Scopus)

Abstract

Principal component analysis (PCA) and minor component analysis (MCA) are powerful techniques for a wide variety of applications such as pattern recognition and signal processing. In this paper, we first propose a differential equation for the generalized eigenvalue problem. We prove that the stable points of this differential equation are the eigenvectors corresponding to the largest eigenvalue. Based on this generalized differential equation, a class of PCA and MCA learning algorithms can be obtained. We demonstrate that many existing PCA and MCA learning algorithms are special cases of this class, and that the class includes some new and simpler MCA learning algorithms. Our results show that all learning algorithms in this class have the same order of convergence speed, and that they are robust to implementation error.
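The paper's unified class is not reproduced in this abstract, but a classical member of the PCA-learning-algorithm family it generalizes is Oja's rule, whose stable points are likewise the unit eigenvectors of the data covariance matrix associated with the largest eigenvalue. A minimal sketch (the learning rate and synthetic data below are illustrative assumptions, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic zero-mean data whose covariance has its principal axis
# along [1, 0]: variances 4.0 and 0.25.
n = 5000
x = rng.normal(size=(n, 2)) * np.array([2.0, 0.5])

# Random unit initial weight vector.
w = rng.normal(size=2)
w /= np.linalg.norm(w)

eta = 0.005  # learning rate (assumed value for this sketch)

for sample in x:
    y = w @ sample                    # scalar neuron output
    w += eta * y * (sample - y * w)   # Hebbian term minus normalizing decay

# After training, w should align with the principal eigenvector [1, 0]
# (up to sign), and its norm should approach 1.
alignment = abs(w[0]) / np.linalg.norm(w)
print(alignment)
```

The decay term `-eta * y**2 * w` keeps the weight vector approximately unit-length without an explicit normalization step, which is why the stable points of the corresponding differential equation are unit eigenvectors rather than arbitrary scalings of them.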

Original language: English
Pages (from-to): 529-533
Number of pages: 5
Journal: IEEE Transactions on Neural Networks
Volume: 11
Issue number: 2
DOIs
Publication status: Published - Mar 2000

Scopus Subject Areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence

User-Defined Keywords

  • Eigenvalue problem
  • Learning algorithms
  • Minor component analysis
  • Principal component analysis
