Abstract
Although the concept of sufficient dimension reduction has been around for a long time, studies in the literature have largely focused on properties of estimators of the dimension-reduction subspace in the classical "small p, large n" setting. Rather than the subspace, this paper considers directly the set of reduced predictors, which we believe is more relevant for subsequent analyses. A principled method is proposed for estimating a sparse reduction, based on a new, revised representation of the well-known sliced inverse regression. A fast and efficient algorithm is developed for computing the estimator. The asymptotic behavior of the new method is studied when the number of predictors, p, exceeds the sample size, n, providing guidance for choosing the number of sufficient dimension-reduction predictors. Numerical results, including a simulation study and a cancer-drug-sensitivity data analysis, are presented to examine its performance.
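As background, the sketch below illustrates the classical sliced inverse regression estimator in Python. It is not the sparse, revised representation proposed in the paper; the function name `sir_directions`, the equal-count slicing scheme, and the `n_slices`/`n_directions` parameters are illustrative assumptions, and the classical form further assumes n > p so that the predictor covariance is invertible, unlike the high-dimensional setting studied here.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_directions=2):
    """Classical sliced inverse regression (background sketch, assumes n > p)."""
    n, p = X.shape

    # Standardize the predictors: Z = (X - mean) Sigma^{-1/2}
    X_centered = X - X.mean(axis=0)
    Sigma = np.cov(X_centered, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(1.0 / np.sqrt(evals)) @ evecs.T
    Z = X_centered @ Sigma_inv_sqrt

    # Slice the response into roughly equal-sized groups by sorted order
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)

    # Weighted covariance of the within-slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m_h = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m_h, m_h)

    # Leading eigenvectors of M, mapped back to the original predictor scale
    _, evecs_M = np.linalg.eigh(M)
    top = evecs_M[:, ::-1][:, :n_directions]
    B = Sigma_inv_sqrt @ top
    return B / np.linalg.norm(B, axis=0)
```

Under these assumptions, the reduced predictors would be obtained as `X @ B`; the paper's contribution concerns estimating such reduced predictors sparsely when p exceeds n, where this classical estimator breaks down.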
| Original language | English |
|---|---|
| Pages (from-to) | 33-46 |
| Number of pages | 14 |
| Journal | Statistics and Computing |
| Volume | 28 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 1 Jan 2018 |
Scopus Subject Areas
- Theoretical Computer Science
- Statistics and Probability
- Statistics, Probability and Uncertainty
- Computational Theory and Mathematics
User-Defined Keywords
- Inverse modeling
- Model-free dimension reduction
- Sparsity