TY - JOUR
T1 - Noisy Non‐Negative Tucker Decomposition With Sparse Factors and Missing Data
AU - Zhang, Xiongjun
AU - Ng, Michael K.
N1 - The research of X. Zhang was supported in part by the National Natural Science Foundation of China under Grant No. 12171189, Hubei Provincial Natural Science Foundation of China under Grant No. JCZRYB202501474, and the Fundamental Research Funds for the Central Universities under Grant No. CCNU24ai002. The research of M. K. Ng was supported in part by the Hong Kong Research Grant Council GRF 12300218, 12300519, 17201020, 17300021, C1013-21GF, C7004-21GF and Joint NSFC-RGC N-HKU76921.
PY - 2025/2/3
Y1 - 2025/2/3
N2 - Tensor decomposition is a powerful tool for extracting physically meaningful latent factors from multi‐dimensional non‐negative data, and has attracted increasing interest in a variety of fields such as image processing, machine learning, and computer vision. In this paper, we propose a sparse non‐negative Tucker decomposition and completion approach for the recovery of underlying non‐negative data from incomplete and generally noisy observations. Here the underlying non‐negative tensor data is decomposed into a core tensor and several factor matrices, with all entries being non‐negative and the factor matrices being sparse. The loss function is derived from the maximum likelihood estimation of the noisy observations, and the ℓ0 norm is employed to enhance the sparsity of the factor matrices. We establish the error bound of the estimator of the proposed model under generic noise scenarios, which is then specialized to observations with additive Gaussian noise, additive Laplace noise, and Poisson observations, respectively. Our theoretical results are better than those of existing tensor‐based or matrix‐based methods. Moreover, the minimax lower bounds are shown to match the derived upper bounds up to logarithmic factors. Numerical experiments on both synthetic data and real‐world data sets demonstrate the superiority of the proposed method for non‐negative tensor data completion.
AB - Tensor decomposition is a powerful tool for extracting physically meaningful latent factors from multi‐dimensional non‐negative data, and has attracted increasing interest in a variety of fields such as image processing, machine learning, and computer vision. In this paper, we propose a sparse non‐negative Tucker decomposition and completion approach for the recovery of underlying non‐negative data from incomplete and generally noisy observations. Here the underlying non‐negative tensor data is decomposed into a core tensor and several factor matrices, with all entries being non‐negative and the factor matrices being sparse. The loss function is derived from the maximum likelihood estimation of the noisy observations, and the ℓ0 norm is employed to enhance the sparsity of the factor matrices. We establish the error bound of the estimator of the proposed model under generic noise scenarios, which is then specialized to observations with additive Gaussian noise, additive Laplace noise, and Poisson observations, respectively. Our theoretical results are better than those of existing tensor‐based or matrix‐based methods. Moreover, the minimax lower bounds are shown to match the derived upper bounds up to logarithmic factors. Numerical experiments on both synthetic data and real‐world data sets demonstrate the superiority of the proposed method for non‐negative tensor data completion.
UR - http://www.scopus.com/inward/record.url?scp=85217023244&partnerID=8YFLogxK
U2 - 10.1002/nla.70009
DO - 10.1002/nla.70009
M3 - Journal article
SN - 1070-5325
VL - 32
JO - Numerical Linear Algebra with Applications
JF - Numerical Linear Algebra with Applications
IS - 1
M1 - e70009
ER -