TY - JOUR
T1 - Probabilistic Rank-One Tensor Analysis with Concurrent Regularizations
AU - Zhou, Yang
AU - Lu, Haiping
AU - Cheung, Yiu-Ming
N1 - This work was supported by the National Natural Science Foundation of China under Grant 61672444 and Grant 61272366.
PY - 2021/7
Y1 - 2021/7
N2 - Subspace learning for tensors has attracted increasing interest in recent years, leading to the development of multilinear extensions of principal component analysis (PCA) and probabilistic PCA (PPCA). Existing multilinear PPCAs are based on the Tucker or CANDECOMP/PARAFAC (CP) models. Although both kinds of multilinear PPCAs have proven effective in dealing with tensors, each has its own limitations. Tucker-based multilinear PPCAs have a restrictive subspace representation and suffer from rotational ambiguity, while CP-based ones are more prone to overfitting. To address these problems, we propose probabilistic rank-one tensor analysis (PROTA), a CP-based multilinear PPCA. PROTA has a more flexible subspace representation than Tucker-based PPCAs and avoids rotational ambiguity. To alleviate overfitting in CP-based PPCAs, we propose two simple and effective regularization strategies, named concurrent regularizations (CRs). By adjusting the noise variance or the moments of latent features, our strategies concurrently and coherently penalize the entire subspace. This relaxes unnecessary scale restrictions and provides more flexibility in regularizing CP-based PPCAs. To take full advantage of the probabilistic framework, we further propose a Bayesian treatment of PROTA, which achieves both automatic feature determination and robustness against overfitting. Experiments on synthetic and real-world datasets demonstrate the superiority of PROTA in subspace estimation and classification, as well as the effectiveness of CRs in alleviating overfitting.
KW - Bayesian inference
KW - dimensionality reduction
KW - machine learning
KW - probabilistic PCA
KW - tensor analysis
UR - http://www.scopus.com/inward/record.url?scp=85109197541&partnerID=8YFLogxK
DO - 10.1109/TCYB.2019.2914316
M3 - Journal article
C2 - 31107678
AN - SCOPUS:85109197541
SN - 2168-2267
VL - 51
SP - 3496
EP - 3509
JO - IEEE Transactions on Cybernetics
JF - IEEE Transactions on Cybernetics
IS - 7
ER -