TY - JOUR
T1 - Robust low-rank tensor completion via transformed tensor nuclear norm with total variation regularization
AU - Qiu, Duo
AU - Bai, Minru
AU - Ng, Michael K.
AU - Zhang, Xiongjun
N1 - Publisher Copyright:
© 2021 Elsevier B.V.
PY - 2021/5/7
Y1 - 2021/5/7
N2 - Robust low-rank tensor completion plays an important role in multidimensional data analysis against different degradations, such as Gaussian noise, sparse noise, and missing entries, and has a variety of applications in image processing and computer vision. In this paper, we investigate the problem of low-rank tensor completion with different degradations for third-order tensors, and propose a transformed tensor nuclear norm method combined with the tensor ℓ1 norm and total variation (TV) regularization. Our model is based on a recently proposed algebraic framework in which the transformed tensor nuclear norm is introduced to capture lower transformed multi-rank by using suitable unitary transformations. We adopt the tensor ℓ1 norm to detect the sparse noise, and the TV regularization to preserve the piecewise smooth structure along the spatial and tubal dimensions. Moreover, a symmetric Gauss–Seidel based alternating direction method of multipliers is developed to solve the resulting model, and its global convergence is established under very mild conditions. Extensive numerical examples on both hyperspectral image and video datasets are carried out to demonstrate the superiority of the proposed model compared with several existing state-of-the-art methods.
AB - Robust low-rank tensor completion plays an important role in multidimensional data analysis against different degradations, such as Gaussian noise, sparse noise, and missing entries, and has a variety of applications in image processing and computer vision. In this paper, we investigate the problem of low-rank tensor completion with different degradations for third-order tensors, and propose a transformed tensor nuclear norm method combined with the tensor ℓ1 norm and total variation (TV) regularization. Our model is based on a recently proposed algebraic framework in which the transformed tensor nuclear norm is introduced to capture lower transformed multi-rank by using suitable unitary transformations. We adopt the tensor ℓ1 norm to detect the sparse noise, and the TV regularization to preserve the piecewise smooth structure along the spatial and tubal dimensions. Moreover, a symmetric Gauss–Seidel based alternating direction method of multipliers is developed to solve the resulting model, and its global convergence is established under very mild conditions. Extensive numerical examples on both hyperspectral image and video datasets are carried out to demonstrate the superiority of the proposed model compared with several existing state-of-the-art methods.
KW - Low-rank tensor completion
KW - Mixed noise
KW - Total variation regularization
KW - Transformed tensor nuclear norm
UR - http://www.scopus.com/inward/record.url?scp=85100012930&partnerID=8YFLogxK
UR - https://www.sciencedirect.com/science/article/pii/S0925231220320221?via%3Dihub
U2 - 10.1016/j.neucom.2020.12.110
DO - 10.1016/j.neucom.2020.12.110
M3 - Journal article
AN - SCOPUS:85100012930
SN - 0925-2312
VL - 435
SP - 197
EP - 215
JO - Neurocomputing
JF - Neurocomputing
ER -