Noisy Non‐Negative Tucker Decomposition With Sparse Factors and Missing Data

Xiongjun Zhang*, Michael K. Ng

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

Abstract

Tensor decomposition is a powerful tool for extracting physically meaningful latent factors from multi‐dimensional non‐negative data, and has attracted increasing interest in a variety of fields such as image processing, machine learning, and computer vision. In this paper, we propose a sparse non‐negative Tucker decomposition and completion approach for recovering underlying non‐negative tensor data from incomplete and generally noisy observations. Here the underlying non‐negative tensor is decomposed into a core tensor and several factor matrices, with all entries being non‐negative and the factor matrices being sparse. The loss function is derived from the maximum likelihood estimation of the noisy observations, and a sparsity‐promoting norm is employed to enhance the sparsity of the factor matrices. We establish an error bound for the estimator of the proposed model under generic noise scenarios, which is then specialized to observations with additive Gaussian noise, additive Laplace noise, and Poisson observations, respectively. Our theoretical results improve on those of existing tensor‐based and matrix‐based methods. Moreover, the minimax lower bounds are shown to match the derived upper bounds up to logarithmic factors. Numerical experiments on both synthetic and real‐world data sets demonstrate the superiority of the proposed method for non‐negative tensor data completion.
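To make the observation model concrete, the following is a minimal NumPy sketch (not the authors' code) of the setup the abstract describes: a third-order non-negative tensor generated by a Tucker model with sparse non-negative factors, observed on a random subset of entries under additive Gaussian noise. The dimensions, ranks, sparsity level, and noise level are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions and Tucker ranks (assumed, not from the paper).
n1, n2, n3 = 20, 20, 20
r1, r2, r3 = 3, 3, 3

# Non-negative core tensor.
G = rng.uniform(0.0, 1.0, size=(r1, r2, r3))

def sparse_nonneg_factor(n, r, density=0.3):
    """Non-negative factor matrix with roughly `density` nonzero entries."""
    U = rng.uniform(0.0, 1.0, size=(n, r))
    U *= rng.random((n, r)) < density  # zero out entries to induce sparsity
    return U

U1 = sparse_nonneg_factor(n1, r1)
U2 = sparse_nonneg_factor(n2, r2)
U3 = sparse_nonneg_factor(n3, r3)

# Tucker reconstruction X = G x_1 U1 x_2 U2 x_3 U3 via einsum.
X = np.einsum('abc,ia,jb,kc->ijk', G, U1, U2, U3)

# Incomplete, noisy observations: sample ~50% of entries, add Gaussian noise.
mask = rng.random(X.shape) < 0.5
Y = np.where(mask, X + 0.01 * rng.standard_normal(X.shape), 0.0)
```

The completion task studied in the paper is then to recover `X` (equivalently, the core tensor and the sparse non-negative factors) from the observed entries `Y[mask]` alone; the Gaussian noise here could be swapped for Laplace noise or a Poisson observation model, matching the three noise scenarios analyzed in the paper.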
Original language: English
Article number: e70009
Journal: Numerical Linear Algebra with Applications
Volume: 32
Issue number: 1
DOIs
Publication status: Published - 3 Feb 2025
