TY - GEN
T1 - Semi-orthogonal multilinear PCA with relaxed start
AU - Shi, Qiquan
AU - Lu, Haiping
N1 - Copyright:
Copyright 2015 Elsevier B.V., All rights reserved.
PY - 2015/7
Y1 - 2015/7
AB - Principal component analysis (PCA) is an unsupervised method for learning low-dimensional features with orthogonal projections. Multilinear PCA methods extend PCA to deal with multidimensional data (tensors) directly via tensor-to-tensor projection or tensor-to-vector projection (TVP). However, under the TVP setting, it is difficult to develop an effective multilinear PCA method with the orthogonality constraint. This paper tackles this problem by proposing a novel Semi-Orthogonal Multilinear PCA (SO-MPCA) approach. SO-MPCA learns low-dimensional features directly from tensors via TVP by imposing the orthogonality constraint in only one mode. This formulation captures more variance and learns more features than full orthogonality. For better generalization, we further introduce a relaxed start (RS) strategy, yielding SO-MPCA-RS, which fixes the starting projection vectors to increase the bias and reduce the variance of the learning model. Experiments on both face (2D) and gait (3D) data demonstrate that SO-MPCA-RS outperforms competing algorithms overall, and that the relaxed start strategy is also effective for other TVP-based PCA methods.
UR - https://www.ijcai.org/proceedings/2015
UR - http://www.scopus.com/inward/record.url?scp=84949766534&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:84949766534
T3 - IJCAI International Joint Conference on Artificial Intelligence
SP - 3805
EP - 3811
BT - IJCAI 2015 - Proceedings of the 24th International Joint Conference on Artificial Intelligence
A2 - Wooldridge, Michael
A2 - Yang, Qiang
PB - International Joint Conferences on Artificial Intelligence
T2 - 24th International Joint Conference on Artificial Intelligence, IJCAI 2015
Y2 - 25 July 2015 through 31 July 2015
ER -