TY - JOUR
T1 - Tensor Completion via Complementary Global, Local, and Nonlocal Priors
AU - Zhao, Xi-Le
AU - Yang, Jing-Hua
AU - Ma, Tian-Hui
AU - Jiang, Tai-Xiang
AU - Ng, Michael K.
AU - Huang, Ting-Zhu
N1 - This work was supported in part by the National Natural Science Foundation of China under Grant 61876203, Grant 61772003, Grant 11901450, Grant 12001446, and Grant 12171072; in part by the Key Project of Applied Basic Research in Sichuan Province under Grant 2020YJ0216; in part by the Applied Basic Research Project of Sichuan Province under Grant 2021YJ0107; in part by the Hong Kong Research Grant Council (HKRGC) through the General Research Fund (GRF) under Grant 12300218, Grant 12300519, Grant 17201020, and Grant 17300021; in part by the National Key Research and Development Program of China under Grant 2020YFA0714001; and in part by the Macao Science and Technology Development Fund through Macao Funding Scheme for Key Research and Development Projects under Grant 0025/2019/AKP.
Publisher Copyright:
© 1992-2012 IEEE.
PY - 2021/12/31
Y1 - 2021/12/31
N2 - Completing missing entries in multidimensional visual data is a typical ill-posed problem that requires appropriate exploitation of prior information of the underlying data. Commonly used priors can be roughly categorized into three classes: global tensor low-rankness, local properties, and nonlocal self-similarity (NSS); most existing works utilize one or two of them to implement completion. Naturally, there arises an interesting question: can one concurrently make use of multiple priors in a unified way, such that they can collaborate with each other to achieve better performance? This work gives a positive answer by formulating a novel tensor completion framework that can simultaneously take advantage of the global-local-nonlocal priors. In the proposed framework, the tensor train (TT) rank is adopted to characterize the global correlation; meanwhile, two Plug-and-Play (PnP) denoisers, including a convolutional neural network (CNN) denoiser and the color block-matching and 3D filtering (CBM3D) denoiser, are incorporated to preserve local details and exploit NSS, respectively. Then, we design a proximal alternating minimization algorithm to efficiently solve this model under the PnP framework. Under mild conditions, we establish the convergence guarantee of the proposed algorithm. Extensive experiments show that these priors organically benefit from each other to achieve state-of-the-art performance both quantitatively and qualitatively.
AB - Completing missing entries in multidimensional visual data is a typical ill-posed problem that requires appropriate exploitation of prior information of the underlying data. Commonly used priors can be roughly categorized into three classes: global tensor low-rankness, local properties, and nonlocal self-similarity (NSS); most existing works utilize one or two of them to implement completion. Naturally, there arises an interesting question: can one concurrently make use of multiple priors in a unified way, such that they can collaborate with each other to achieve better performance? This work gives a positive answer by formulating a novel tensor completion framework that can simultaneously take advantage of the global-local-nonlocal priors. In the proposed framework, the tensor train (TT) rank is adopted to characterize the global correlation; meanwhile, two Plug-and-Play (PnP) denoisers, including a convolutional neural network (CNN) denoiser and the color block-matching and 3D filtering (CBM3D) denoiser, are incorporated to preserve local details and exploit NSS, respectively. Then, we design a proximal alternating minimization algorithm to efficiently solve this model under the PnP framework. Under mild conditions, we establish the convergence guarantee of the proposed algorithm. Extensive experiments show that these priors organically benefit from each other to achieve state-of-the-art performance both quantitatively and qualitatively.
KW - Alternating direction method of multipliers
KW - Color block-matching and 3D filtering
KW - Convolutional neural network
KW - Plug-and-play
KW - Proximal alternating minimization
KW - Tensor train rank
UR - http://www.scopus.com/inward/record.url?scp=85122568632&partnerID=8YFLogxK
U2 - 10.1109/TIP.2021.3138325
DO - 10.1109/TIP.2021.3138325
M3 - Journal article
C2 - 34971534
AN - SCOPUS:85122568632
SN - 1057-7149
VL - 31
SP - 984
EP - 999
JO - IEEE Transactions on Image Processing
JF - IEEE Transactions on Image Processing
ER -