Abstract
Robust principal component analysis plays a key role in various fields such as image and video processing, data mining, and hyperspectral data analysis. In this paper, we study the problem of robust tensor train (TT) principal component analysis from partial observations, which aims to decompose a given tensor into a low-TT-rank component and a sparse component. The decomposition in the proposed model reveals hidden factors and helps alleviate the curse of dimensionality via a set of connected low-rank tensors. A relaxation model is proposed that minimizes a weighted combination of the sum of the nuclear norms of the unfolding matrices of the core tensors and the tensor $\ell_1$ norm. A proximal alternating direction method of multipliers is developed to solve the resulting model. Furthermore, we show that, under some conditions, any cluster point of the convergent subsequence is a Karush-Kuhn-Tucker point of the proposed model. Extensive numerical examples on both synthetic data and real-world datasets demonstrate the effectiveness of the proposed approach.
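For concreteness, a plausible form of the relaxation model described above is sketched below. The abstract does not give the notation, so the core tensors $\mathcal{G}_k$, the weights $\alpha_k$, the regularization parameter $\lambda$, and the partial-observation projector $P_\Omega$ are assumptions here, not the authors' exact formulation:

$$
\min_{\mathcal{G}_1,\dots,\mathcal{G}_d,\;\mathcal{S}} \;\; \sum_{k=1}^{d} \alpha_k \big\| (\mathcal{G}_k)_{(2)} \big\|_* \;+\; \lambda\, \|\mathcal{S}\|_1
\quad \text{s.t.} \quad P_\Omega\!\big(\mathrm{TT}(\mathcal{G}_1,\dots,\mathcal{G}_d) + \mathcal{S}\big) = P_\Omega(\mathcal{X}).
$$

Any proximal ADMM for a nuclear-norm-plus-$\ell_1$ model of this type is built on two standard proximal maps, sketched here in Python as a minimal illustration (the function names are ours, not the paper's):

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal map of tau * ||.||_1: elementwise soft-thresholding."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def svt(M, tau):
    """Proximal map of tau * ||.||_* on a matrix: singular value thresholding."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * soft_threshold(s, tau)) @ Vt
```

In a scheme of this kind, each iteration updates the sparse component by soft-thresholding and each core-tensor unfolding by singular value thresholding, followed by a multiplier update; the exact variable splitting and proximal terms are specific to the paper.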
| Original language | English |
| --- | --- |
| Article number | e2403 |
| Number of pages | 26 |
| Journal | Numerical Linear Algebra with Applications |
| Volume | 29 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - Jan 2022 |
Scopus Subject Areas
- Algebra and Number Theory
- Applied Mathematics
User-Defined Keywords
- Low-rank tensor
- Nuclear norm
- Proximal alternating direction method of multipliers
- Robust principal component analysis
- Tensor train decomposition