Abstract
This paper is concerned with sparse PCA via matrix (2,1)-norm regularization (PCA2,1). The regularization produces a row-sparse projection, a useful tool in machine learning tasks such as feature selection, where the goal is to choose the most relevant features. Mathematically, PCA2,1 is a non-smooth optimization problem on the Stiefel manifold. For a suitably chosen regularization parameter, the optimal projection matrix has many negligible rows. A practical NEPv (nonlinear eigenvalue problem with eigenvector dependency) approach is proposed to compute the optimal projection matrix iteratively. The approach is shown to be globally convergent in the sense that the objective increases monotonically during the iteration and any accumulation point of the iterates is a stationary point of the optimization problem. Extensive numerical experiments, with an application to feature selection, demonstrate the performance of the practical NEPv approach in comparison with existing feature selection methods in terms of classification accuracy. The results show that PCA2,1 is highly effective and often yields better classification results than feature selection methods in current use.
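As an illustration of the mechanism the abstract describes (not the paper's algorithm), the following sketch shows how the matrix (2,1)-norm measures row sparsity and how near-zero rows of a projection matrix translate into discarded features. The matrix `W`, the tolerance `tol`, and the helper names are hypothetical choices for this example.

```python
import numpy as np

def norm_2_1(W):
    """Matrix (2,1)-norm: the sum of the Euclidean norms of the rows of W.
    Penalizing this quantity drives entire rows toward zero (row sparsity)."""
    return float(np.sum(np.linalg.norm(W, axis=1)))

def select_features(W, tol=1e-6):
    """Indices of rows whose norm exceeds tol, i.e. the retained features."""
    return np.where(np.linalg.norm(W, axis=1) > tol)[0]

# A toy 3-feature, 2-component projection; the zero row mimics a
# "negligible row" produced by the regularization.
W = np.array([[0.8, 0.1],
              [0.0, 0.0],   # negligible row: this feature is dropped
              [0.3, 0.5]])

print(norm_2_1(W))         # sqrt(0.65) + 0 + sqrt(0.34) ≈ 1.389
print(select_features(W))  # [0 2]
```

This is only the feature-selection step; computing the optimal row-sparse projection itself requires solving the regularized problem on the Stiefel manifold, e.g. by the NEPv/SCF iteration studied in the paper.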
| Original language | English |
|---|---|
| Article number | 100676 |
| Number of pages | 16 |
| Journal | Results in Applied Mathematics |
| Volume | 28 |
| DOIs | |
| Publication status | Published - Nov 2025 |
UN SDGs
This output contributes to the following UN Sustainable Development Goals (SDGs)
- SDG 4 Quality Education
User-Defined Keywords
- Feature selection
- Matrix (2,1)-norm regularization
- NEPv
- PCA
- Row-sparse projection
- SCF
Fingerprint
Dive into the research topics of 'Sparse PCA via matrix (2,1)-norm regularization with an application to feature selection'. Together they form a unique fingerprint.