TY - GEN
T1 - Proper inner product with mean displacement for Gaussian noise invariant ICA
AU - Song, Liyan
AU - Lu, Haiping
N1 - Funding Information:
This work was supported by Research Grants Council of the Hong Kong SAR (Grant 12200915). Liyan is partially financially supported by FRG2/13-14/073. We thank Dr. James Voss for providing their codes and many helpful discussions.
PY - 2016/11
Y1 - 2016/11
N2 - Independent Component Analysis (ICA) is a classical method for Blind Source Separation (BSS). In this paper, we are interested in ICA in the presence of noise, i.e., the noisy ICA problem. Pseudo-Euclidean Gradient Iteration (PEGI) is a recent cumulant-based method that defines a pseudo-Euclidean inner product to replace a quasi-whitening step in Gaussian noise invariant ICA. However, PEGI has two major limitations: 1) the pseudo-Euclidean inner product is improper because it violates the positive definiteness of an inner product; 2) the inner product matrix is orthogonal by design, but it suffers gross errors or imperfections due to sample-based estimation. This paper proposes a new cumulant-based ICA method named PIMD to address these two problems. We first define a Proper Inner product (PI) with proven positive definiteness and then relax the centering preprocessing step to a Mean Displacement (MD) step. Both PI and MD aim to improve the orthogonality of the inner product matrix and the recovery of independent components (ICs) in sample-based estimation. We adopt a gradient iteration step to find the ICs for PIMD. Experiments on both synthetic and real data show the respective effectiveness of PI and MD as well as the superiority of PIMD over competing ICA methods. Moreover, MD can improve the performance of other ICA methods as well.
KW - Blind Source Separation
KW - Cumulants
KW - Inner Product
KW - Noisy Independent Component Analysis
KW - Pseudo-whitening
UR - https://proceedings.mlr.press/v63/Song106.html
UR - http://proceedings.mlr.press/v63/
UR - http://www.scopus.com/inward/record.url?scp=85071007891&partnerID=8YFLogxK
M3 - Conference proceeding
AN - SCOPUS:85071007891
T3 - Proceedings of Machine Learning Research
SP - 398
EP - 413
BT - Proceedings of The 8th Asian Conference on Machine Learning
PB - ML Research Press
T2 - 8th Asian Conference on Machine Learning, ACML 2016
Y2 - 16 November 2016 through 18 November 2016
ER -