Abstract
Sufficient dimension reduction is a body of theory and methods for reducing the dimensionality of predictors while preserving information about the regression. In this paper we propose a sparse sufficient dimension reduction method that yields interpretable reductions. It is designed for situations in which the number of correlated predictors is large relative to the sample size. The new procedure is based on the optimal scoring interpretation of sliced inverse regression; because optimal scoring recasts the problem in a regression framework, commonly used regularization techniques can be applied directly. Simulation studies demonstrate the effectiveness and efficiency of the proposed approach.
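To make the optimal scoring connection concrete, below is a minimal sketch of how sliced inverse regression can be cast as an alternating scoring/regression problem, with a lasso penalty inducing sparsity in the estimated direction. This is an illustration under stated assumptions, not the paper's algorithm: the function `sparse_sir_one_direction` and its parameters (`n_slices`, `alpha`, `n_iter`) are hypothetical names chosen for this sketch.

```python
# Illustrative sketch (not the authors' exact procedure): one sparse SIR
# direction via an optimal-scoring-style alternation. Slice labels of the
# response play the role of classes; a lasso fit keeps the direction sparse.
import numpy as np
from sklearn.linear_model import Lasso

def sparse_sir_one_direction(X, y, n_slices=5, alpha=0.1, n_iter=20, seed=0):
    n, p = X.shape
    Xc = X - X.mean(axis=0)                      # center predictors

    # Build the slice-indicator matrix from the sorted response.
    order = np.argsort(y)
    slice_ids = np.empty(n, dtype=int)
    slice_ids[order] = np.arange(n) * n_slices // n
    Y = np.zeros((n, n_slices))
    Y[np.arange(n), slice_ids] = 1.0

    D = Y.T @ Y / n                              # diagonal slice proportions
    rng = np.random.default_rng(seed)
    theta = rng.standard_normal(n_slices)        # initial score vector
    theta /= np.sqrt(theta @ D @ theta)

    lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000)
    for _ in range(n_iter):
        # Regression step: sparse fit of the scored response on X.
        beta = lasso.fit(Xc, Y @ theta).coef_
        # Scoring step: regress the fitted values on the slice indicators,
        # then renormalize the scores.
        fitted = Xc @ beta
        theta = np.linalg.solve(Y.T @ Y, Y.T @ fitted)
        norm = np.sqrt(theta @ D @ theta)
        if norm > 0:
            theta /= norm
    return beta

# Toy usage: a single-index model where only the first two predictors matter.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 30))
y = (X[:, 0] + X[:, 1]) ** 2 + 0.1 * rng.standard_normal(200)
print(np.round(sparse_sir_one_direction(X, y), 2))
```

The sketch recovers only a single sparse direction on a toy example; a full optimal-scoring treatment would extract several directions under orthogonality constraints and could swap in other standard regression regularizers.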
| Original language | English |
| --- | --- |
| Pages (from-to) | 223-232 |
| Number of pages | 10 |
| Journal | Computational Statistics and Data Analysis |
| Volume | 57 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - Jan 2013 |
Scopus Subject Areas
- Statistics and Probability
- Computational Mathematics
- Computational Theory and Mathematics
- Applied Mathematics
User-Defined Keywords
- High dimensionality
- Linear discriminant analysis
- Optimal scoring
- Sliced inverse regression
- Sparsity
- Sufficient dimension reduction