Abstract
When applying independent component analysis (ICA), we sometimes expect the connections between the observed mixtures and the recovered independent components (or the original sources) to be sparse, either to ease interpretation or to reduce random effects in the results. In this paper we propose two methods to address this problem. The first is based on adaptive Lasso, which exploits the L1 penalty with data-adaptive weights; we show how this method relates to classic information criteria such as BIC and AIC. The second is based on optimal brain surgeon (OBS), and we show how its stopping criterion is related to the information criteria. This method produces the solution path of the transformation matrix, with varying numbers of zero entries. Both methods have low computational cost. Moreover, in each method, the parameter controlling the sparsity level of the transformation matrix has a clear interpretation: by setting it to certain values, the proposed methods yield results consistent with those produced by the classic information criteria.
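The adaptive-Lasso idea in the abstract can be illustrated with a minimal sketch. Under an orthonormal design the L1-penalized problem decouples coordinate-wise, so each coefficient is soft-thresholded by its own data-adaptive amount `lam * w_j`, with weights `w_j = 1 / |beta_ols_j|**gamma` (Zou's adaptive Lasso). This is an illustrative simplification, not the paper's ICA-specific algorithm; the function name and the orthonormal-design assumption are ours.

```python
import numpy as np

def adaptive_lasso_orthonormal(beta_ols, lam, gamma=1.0):
    """Adaptive-Lasso estimate assuming an orthonormal design (X'X = I).

    Each coefficient is soft-thresholded by lam * w_j with the
    data-adaptive weight w_j = 1 / |beta_ols_j|**gamma: large OLS
    coefficients are penalized lightly, small ones are driven to zero,
    which is what makes the recovered transformation sparse.
    """
    w = 1.0 / (np.abs(beta_ols) ** gamma + 1e-12)  # data-adaptive weights
    thresh = lam * w
    return np.sign(beta_ols) * np.maximum(np.abs(beta_ols) - thresh, 0.0)

# Small coefficients receive large weights and are zeroed out.
beta_ols = np.array([2.0, 0.5, -0.05])
print(adaptive_lasso_orthonormal(beta_ols, lam=0.1))
```

Raising `lam` (or `gamma`) increases sparsity; the paper's point is that particular choices of this control parameter reproduce the model selected by BIC or AIC.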
| Original language | English |
| --- | --- |
| Pages (from-to) | 195-202 |
| Number of pages | 8 |
| Journal | Lecture Notes in Computer Science |
| Volume | 5441 |
| Publication status | Published - 2009 |
| Event | 8th International Conference on Independent Component Analysis and Signal Separation, ICA 2009 - Paraty, Brazil. Duration: 15 Mar 2009 → 18 Mar 2009 |
Scopus Subject Areas
- Theoretical Computer Science
- Computer Science (all)