Abstract
Tensor analysis has received widespread attention in high-dimensional data learning. Unfortunately, tensor data are often accompanied by arbitrary signal corruptions, including missing entries and sparse noise. How to recover the characteristics of corrupted tensor data and make them compatible with the downstream clustering task remains a challenging problem. In this article, we study a generalized transformed tensor low-rank representation (TTLRR) model for simultaneously recovering and clustering corrupted tensor data. The core idea is to find the latent low-rank tensor structure from the corrupted measurements using the transformed tensor singular value decomposition (SVD). Theoretically, we prove that TTLRR can recover the clean tensor data with a high-probability guarantee under mild conditions. Furthermore, by using a transform adaptively learned from the data itself, the proposed TTLRR model can approximately represent and exploit the intrinsic subspace and precisely uncover the cluster structure of the tensor data. An effective algorithm is designed to solve the proposed model under the alternating direction method of multipliers (ADMM) framework. The effectiveness and superiority of the proposed method over the compared methods are demonstrated on different tasks, including video/face data recovery and face/object/scene data clustering.
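The transformed tensor SVD mentioned in the abstract factorizes a third-order tensor by applying an invertible transform along the third mode and then taking an ordinary matrix SVD of each transformed frontal slice. The sketch below illustrates this idea only; the function names and the choice of an orthogonal transform `L` are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def transformed_tsvd(X, L):
    # Sketch of a transformed t-SVD: L is an invertible n3 x n3 transform
    # applied along mode-3; each transformed frontal slice then gets a
    # standard matrix SVD.
    n1, n2, n3 = X.shape
    Xt = np.einsum('kl,ijl->ijk', L, X)        # mode-3 transform
    U = np.zeros((n1, n1, n3))
    S = np.zeros((n1, n2, n3))
    V = np.zeros((n2, n2, n3))
    for k in range(n3):
        u, s, vh = np.linalg.svd(Xt[:, :, k])  # full SVD of slice k
        U[:, :, k] = u
        S[:len(s), :len(s), k] = np.diag(s)
        V[:, :, k] = vh.T                      # real-valued case
    return U, S, V

def inverse_mode3_transform(Y, L):
    # Undo the mode-3 transform with L^{-1}.
    return np.einsum('kl,ijl->ijk', np.linalg.inv(L), Y)

# Usage: round-trip a random tensor through its factors.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 5, 3))
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # assumed orthogonal transform
U, S, V = transformed_tsvd(X, Q)
Xt_hat = np.stack(
    [U[:, :, k] @ S[:, :, k] @ V[:, :, k].T for k in range(3)], axis=2)
X_hat = inverse_mode3_transform(Xt_hat, Q)
print(np.allclose(X, X_hat))  # True
```

In TTLRR the transform itself is learned from the data rather than fixed in advance; the orthogonal `Q` above merely stands in for any invertible choice.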
| Original language | English |
|---|---|
| Pages (from-to) | 8839-8853 |
| Number of pages | 15 |
| Journal | IEEE Transactions on Neural Networks and Learning Systems |
| Volume | 35 |
| Issue number | 7 |
| Early online date | 3 Nov 2022 |
| DOIs | |
| Publication status | Published - Jul 2024 |
User-Defined Keywords
- Recoverability guarantee
- tensor data recovery
- tensor subspace clustering
- transformed tensor low-rank representation (TTLRR)
Fingerprint
Dive into the research topics of 'Robust Corrupted Data Recovery and Clustering via Generalized Transformed Tensor Low-Rank Representation'. Together they form a unique fingerprint.