Abstract
Dimension reduction in semiparametric regressions includes the construction of informative linear combinations and the selection of contributing predictors. To reduce the predictor dimension in semiparametric regressions, we propose an ℓ1-minimization of sliced inverse regression with the Dantzig selector, and establish a non-asymptotic error bound for the resulting estimator. We also generalize the regularization concept to sliced inverse regression with an adaptive Dantzig selector. This ensures that all contributing predictors are selected with high probability, and that the resulting estimator is asymptotically normal even when the predictor dimension diverges to infinity. Numerical studies confirm our theoretical observations and demonstrate that our proposals are superior to existing estimators in terms of both dimension reduction and predictor selection.
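As background for the abstract above, the unregularized procedure that the paper's Dantzig-selector formulation builds on is classical sliced inverse regression: slice the response, average the standardized predictors within slices, and take leading eigenvectors of the between-slice covariance. The sketch below is illustrative only; the function name `sir_directions` and the toy single-index model are assumptions for the example, not the paper's estimator (which adds an ℓ1/Dantzig-selector constraint on top of this).

```python
import numpy as np

def sir_directions(X, y, n_slices=5, n_dirs=1):
    """Classical (unpenalized) sliced inverse regression:
    estimate directions spanning the central subspace."""
    n, p = X.shape
    # Standardize predictors: whiten with the inverse square root of cov(X)
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ Sigma_inv_sqrt
    # Slice on the order statistics of y and average Z within each slice
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of the between-slice covariance,
    # mapped back to the original X scale
    w, v = np.linalg.eigh(M)
    dirs = Sigma_inv_sqrt @ v[:, ::-1][:, :n_dirs]
    return dirs / np.linalg.norm(dirs, axis=0)

# Toy single-index model: y depends on X only through X @ beta
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 6))
beta = np.array([1.0, 1.0, 0, 0, 0, 0]) / np.sqrt(2)
y = X @ beta + 0.1 * rng.standard_normal(500)
b_hat = sir_directions(X, y, n_slices=10)[:, 0]
print(abs(b_hat @ beta))  # close to 1 when the direction is recovered
```

In the sparse, high-dimensional setting that the paper targets, the eigen-step above is replaced by an ℓ1-constrained (Dantzig-selector) estimate, which is what yields predictor selection alongside dimension reduction.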
| Original language | English |
|---|---|
| Pages (from-to) | 641-654 |
| Number of pages | 14 |
| Journal | Biometrika |
| Volume | 100 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - Sept 2013 |
Scopus Subject Areas
- Statistics and Probability
- Mathematics (all)
- Agricultural and Biological Sciences (miscellaneous)
- Agricultural and Biological Sciences (all)
- Statistics, Probability and Uncertainty
- Applied Mathematics
User-Defined Keywords
- Dantzig selector
- Dimension reduction
- Sliced inverse regression
- Variable selection