TY - JOUR
T1 - Robust Corrupted Data Recovery and Clustering via Generalized Transformed Tensor Low-Rank Representation
AU - Yang, Jing-Hua
AU - Chen, Chuan
AU - Dai, Hong-Ning
AU - Ding, Meng
AU - Wu, Zhe-Bin
AU - Zheng, Zibin
N1 - Funding information:
This work was supported in part by the Key-Area Research and Development Program of Guangdong Province under Grant 2020B010165003; in part by the National Natural Science Foundation of China under Grant 62176269 and Grant 12201522; in part by the Macao Science and Technology Development Fund, Macao Funding Scheme for Key Research and Development Projects, under Grant 0025/2019/AKP; and in part by the Innovative Research Foundation of Ship General Performance under Grant 25622112. (Corresponding authors: Chuan Chen; Hong-Ning Dai.)
PY - 2024/7
Y1 - 2024/7
N2 - Tensor analysis has received widespread attention in high-dimensional data learning. Unfortunately, tensor data are often accompanied by arbitrary signal corruptions, including missing entries and sparse noise. How to recover the characteristics of the corrupted tensor data and make them compatible with the downstream clustering task remains a challenging problem. In this article, we study a generalized transformed tensor low-rank representation (TTLRR) model for simultaneously recovering and clustering the corrupted tensor data. The core idea is to find the latent low-rank tensor structure from the corrupted measurements using the transformed tensor singular value decomposition (SVD). Theoretically, we prove that TTLRR can recover the clean tensor data with a high probability guarantee under mild conditions. Furthermore, by using a transform adaptively learned from the data itself, the proposed TTLRR model can approximately represent and exploit the intrinsic subspace and precisely seek out the cluster structure of the tensor data. An effective algorithm is designed to solve the proposed model under the alternating direction method of multipliers (ADMM) framework. The effectiveness and superiority of the proposed method over the compared methods are showcased on different tasks, including video/face data recovery and face/object/scene data clustering.
AB - Tensor analysis has received widespread attention in high-dimensional data learning. Unfortunately, tensor data are often accompanied by arbitrary signal corruptions, including missing entries and sparse noise. How to recover the characteristics of the corrupted tensor data and make them compatible with the downstream clustering task remains a challenging problem. In this article, we study a generalized transformed tensor low-rank representation (TTLRR) model for simultaneously recovering and clustering the corrupted tensor data. The core idea is to find the latent low-rank tensor structure from the corrupted measurements using the transformed tensor singular value decomposition (SVD). Theoretically, we prove that TTLRR can recover the clean tensor data with a high probability guarantee under mild conditions. Furthermore, by using a transform adaptively learned from the data itself, the proposed TTLRR model can approximately represent and exploit the intrinsic subspace and precisely seek out the cluster structure of the tensor data. An effective algorithm is designed to solve the proposed model under the alternating direction method of multipliers (ADMM) framework. The effectiveness and superiority of the proposed method over the compared methods are showcased on different tasks, including video/face data recovery and face/object/scene data clustering.
KW - Recoverability guarantee
KW - tensor data recovery
KW - tensor subspace clustering
KW - transformed tensor low-rank representation (TTLRR)
UR - http://www.scopus.com/inward/record.url?scp=85141620893&partnerID=8YFLogxK
U2 - 10.1109/TNNLS.2022.3215983
DO - 10.1109/TNNLS.2022.3215983
M3 - Journal article
C2 - 36327183
SN - 2162-237X
VL - 35
SP - 8839
EP - 8853
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
IS - 7
ER -