Robust tensor train component analysis

Xiongjun Zhang*, Michael K. Ng

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

5 Citations (Scopus)

Abstract

Robust principal component analysis plays a key role in various fields such as image and video processing, data mining, and hyperspectral data analysis. In this paper, we study the problem of robust tensor train (TT) principal component analysis from partial observations, which aims to decompose a given tensor into a low-TT-rank component and a sparse component. The decomposition of the proposed model is used to find the hidden factors and helps alleviate the curse of dimensionality via a set of connected low-rank tensors. A relaxation model is proposed that minimizes a weighted combination of the sum of nuclear norms of unfolding matrices of core tensors and the tensor ℓ1 norm. A proximal alternating direction method of multipliers is developed to solve the resulting model. Furthermore, we show that, under some conditions, any cluster point of a convergent subsequence is a Karush-Kuhn-Tucker point of the proposed model. Extensive numerical examples on both synthetic data and real-world datasets are presented to demonstrate the effectiveness of the proposed approach.
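To make the two building blocks of the relaxation concrete, here is a minimal NumPy sketch of the classical matrix special case of robust PCA, where the nuclear-norm term is handled by singular value thresholding and the ℓ1 term by entrywise soft thresholding. This is not the paper's algorithm: the proposed model applies these proximal steps to unfoldings of TT core tensors and accounts for partial observations, both of which this simplified sketch omits; the function names and the default weight are hypothetical.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft_threshold(X, tau):
    """Entrywise soft thresholding: proximal operator of tau * l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def robust_pca_admm(D, lam=None, mu=1.0, n_iter=200):
    """ADMM for min ||L||_* + lam*||S||_1 subject to D = L + S.

    Matrix-case analogue of the paper's TT model, shown only to
    illustrate the interplay of the two proximal operators.
    """
    m, n = D.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))  # common RPCA weight (assumed default)
    L = np.zeros_like(D)
    S = np.zeros_like(D)
    Y = np.zeros_like(D)  # dual variable
    for _ in range(n_iter):
        L = svt(D - S + Y / mu, 1.0 / mu)             # low-rank update
        S = soft_threshold(D - L + Y / mu, lam / mu)  # sparse update
        Y = Y + mu * (D - L - S)                      # dual ascent
    return L, S
```

In the TT setting, the single SVT step above is replaced by thresholding the singular values of the unfolding matrices of each core tensor, with the per-unfolding weights entering the combination described in the abstract.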

Original language: English
Article number: e2403
Number of pages: 26
Journal: Numerical Linear Algebra with Applications
Volume: 29
Issue number: 1
DOIs
Publication status: Published - Jan 2022

Scopus Subject Areas

  • Algebra and Number Theory
  • Applied Mathematics

User-Defined Keywords

  • Low-rank tensor
  • Nuclear norm
  • Proximal alternating direction method of multipliers
  • Robust principal component analysis
  • Tensor train decomposition
