Robust low-rank tensor completion via transformed tensor nuclear norm with total variation regularization

Duo Qiu, Minru Bai*, Michael K. Ng, Xiongjun Zhang

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

43 Citations (Scopus)

Abstract

Robust low-rank tensor completion plays an important role in multidimensional data analysis against different degradations, such as Gaussian noise, sparse noise, and missing entries, and has a variety of applications in image processing and computer vision. In this paper, we investigate the problem of low-rank tensor completion with different degradations for third-order tensors, and propose a transformed tensor nuclear norm method combined with the tensor ℓ1 norm and total variation (TV) regularization. Our model is based on a recently proposed algebraic framework in which the transformed tensor nuclear norm is introduced to capture the low transformed multi-rank structure by using suitable unitary transformations. We adopt the tensor ℓ1 norm to detect the sparse noise, and the TV regularization to preserve the piecewise smooth structure along the spatial and tubal dimensions. Moreover, a symmetric Gauss–Seidel based alternating direction method of multipliers is developed to solve the resulting model, and its global convergence is established under very mild conditions. Extensive numerical experiments on both hyperspectral image and video datasets demonstrate the superiority of the proposed model over several existing state-of-the-art methods.
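The transformed tensor nuclear norm described in the abstract can be sketched as follows: apply a unitary transform along the third (tubal) mode of a third-order tensor, then sum the nuclear norms (singular values) of the resulting frontal slices. The snippet below is a minimal illustration of that idea, not the authors' implementation; the function name and the choice of the normalized DFT as the default transform are assumptions for illustration.

```python
import numpy as np

def transformed_tensor_nuclear_norm(X, U=None):
    """Illustrative sketch: sum of nuclear norms of the frontal slices of X
    after a unitary transform along the third (tubal) mode. With U equal to
    the normalized DFT matrix this recovers the usual tensor nuclear norm;
    other unitary U give the transformed variant the abstract refers to."""
    n1, n2, n3 = X.shape
    if U is None:
        # normalized DFT matrix along mode 3 (unitary: U @ U.conj().T = I)
        U = np.fft.fft(np.eye(n3), axis=0) / np.sqrt(n3)
    # apply the transform to every tube X[i, j, :]
    Xt = np.einsum('kl,ijl->ijk', U, X.astype(complex))
    # sum the singular values of every transformed frontal slice
    return sum(np.linalg.svd(Xt[:, :, k], compute_uv=False).sum()
               for k in range(n3))
```

For example, for the all-ones 2×2×2 tensor the DFT concentrates every tube into the first transformed slice, so only one slice contributes singular values; any other unitary `U` (e.g. from a discrete cosine transform) can be passed in to obtain a transformed variant.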

Original language: English
Pages (from-to): 197-215
Number of pages: 19
Journal: Neurocomputing
Volume: 435
Early online date: 9 Jan 2021
DOIs
Publication status: Published - 7 May 2021

Scopus Subject Areas

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence

User-Defined Keywords

  • Low-rank tensor completion
  • Mixed noise
  • Total variation regularization
  • Transformed tensor nuclear norm
