Tensor train rank minimization with nonlocal self-similarity for tensor completion

Meng Ding, Ting-Zhu Huang*, Xi-Le Zhao*, Michael K. Ng, Tian-Hui Ma

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

19 Citations (Scopus)

Abstract

The tensor train (TT) rank has received increasing attention in tensor completion due to its ability to capture the global correlation of high-order tensors (order > 3). For third-order visual data, direct TT rank minimization does not exploit the full potential of the TT rank for high-order tensors, while TT rank minimization combined with ket augmentation, which transforms a lower-order tensor (e.g., visual data) into a higher-order one, suffers from serious block artifacts. To tackle this issue, we propose TT rank minimization with nonlocal self-similarity for tensor completion, which simultaneously exploits the spatial, temporal/spectral, and nonlocal redundancy in visual data. More precisely, TT rank minimization is performed on a higher-order tensor, called a group, formed by stacking similar cubes, which naturally and fully takes advantage of the ability of the TT rank to capture correlations in high-order tensors. Moreover, we establish a perturbation analysis for the TT low-rankness of each group. We develop an alternating direction method of multipliers tailored to the specific structure of the proposed model. Extensive experiments demonstrate that the proposed method is superior to several existing state-of-the-art methods in terms of both qualitative and quantitative measures.
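To make the TT-rank idea concrete, the sketch below computes a weighted sum of nuclear norms of the canonical (TT) unfoldings of a group tensor, a standard convex surrogate for the TT rank in TT-based completion methods. This is a minimal illustration, not the authors' implementation: the cube size (8×8×3), the number of stacked similar cubes (20), and the uniform weights are illustrative assumptions.

```python
import numpy as np

def tt_rank_surrogate(group, alphas=None):
    """Weighted sum of nuclear norms of the canonical (TT) unfoldings
    of an N-th-order tensor -- a common convex surrogate for the TT rank.
    Illustrative sketch only; not the authors' code."""
    dims = group.shape
    n = group.ndim
    if alphas is None:
        # Uniform weights over the N-1 unfoldings (an assumption).
        alphas = np.full(n - 1, 1.0 / (n - 1))
    total = 0.0
    for k in range(1, n):
        # k-th canonical matricization: first k modes index the rows,
        # remaining modes index the columns (row/column ordering does
        # not affect the singular values).
        mat = group.reshape(int(np.prod(dims[:k])), -1)
        total += alphas[k - 1] * np.linalg.norm(mat, ord='nuc')
    return total

# Example: a 4th-order "group" formed by stacking 20 similar 8x8x3 cubes.
rng = np.random.default_rng(0)
group = rng.standard_normal((8, 8, 3, 20))
print(tt_rank_surrogate(group))
```

Stacking similar cubes into a fourth-order group is what lets the TT unfoldings separate spatial, spectral, and nonlocal modes; on a plain third-order image the middle unfoldings collapse and the surrogate degenerates toward ordinary matrix nuclear norms.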

Original language: English
Pages (from-to): 475-498
Number of pages: 24
Journal: Inverse Problems and Imaging
Volume: 15
Issue number: 3
DOI
Publication status: Published - Jun 2021

Scopus Subject Areas

  • Analysis
  • Modelling and Simulation
  • Discrete Mathematics and Combinatorics
  • Control and Optimization

User-Defined Keywords

  • Alternating direction method of multipliers
  • Low-rank tensor completion
  • Nonlocal self-similarity
  • Tensor train rank
