Abstract
In this paper we estimate the direction in general single-index models and simultaneously select important variables when a diverging number of predictors is involved in the regression. To this end, we propose a nonconcave penalized inverse regression method. Specifically, the resulting estimator with the SCAD penalty enjoys an oracle property in semi-parametric models even when the dimension p_n of the predictors goes to infinity. Under regularity conditions we also establish asymptotic normality when the dimension of the predictor vector diverges at the rate p_n = o(n^{1/3}), where n is the sample size; this enables us to construct confidence intervals/regions for the estimated index. The asymptotic results are augmented by simulations and illustrated by an analysis of an air pollution dataset.
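The abstract's oracle property hinges on the shape of the SCAD penalty. As a point of reference, a minimal sketch of the SCAD penalty function as defined by Fan and Li (2001), with the conventional shape parameter a = 3.7, might look as follows; this is an illustration of the penalty itself, not the paper's estimation procedure:

```python
import numpy as np

def scad_penalty(theta, lam, a=3.7):
    """SCAD penalty p_lambda(|theta|) of Fan & Li (2001).

    Piecewise: linear (lasso-like) near zero, a quadratic blend in the
    middle, and constant beyond a*lam, so large coefficients are not
    shrunk -- the feature behind the oracle property.
    """
    t = np.abs(np.asarray(theta, dtype=float))
    linear = lam * t                                        # |theta| <= lam
    quad = (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1))  # lam < |theta| <= a*lam
    flat = lam**2 * (a + 1) / 2                             # |theta| > a*lam
    return np.where(t <= lam, linear, np.where(t <= a * lam, quad, flat))
```

The three pieces meet continuously at |theta| = lam and |theta| = a*lam, and the penalty is flat for large coefficients, which is what distinguishes SCAD from the lasso's constant-rate shrinkage.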
Original language | English
---|---
Pages (from-to) | 862-875
Number of pages | 14
Journal | Journal of Multivariate Analysis
Volume | 100
Issue number | 5
DOIs |
Publication status | Published - May 2009
Scopus Subject Areas
- Statistics and Probability
- Numerical Analysis
- Statistics, Probability and Uncertainty
User-Defined Keywords
- 62G20
- 62H15
- Dimension reduction
- Diverging parameters
- Inverse regression
- SCAD
- Sparsity