High-dimensional predictors are often introduced in regressions to attenuate possible modeling bias. We consider stable direction recovery in single-index models, in which we assume only that the response Y is independent of the diverging-dimensional predictors X given β0ᵀX, where β0 is a pn × 1 vector and pn → ∞ as the sample size n → ∞. We first explore sufficient conditions under which the least squares estimator β̂n recovers the direction β0 consistently even when pn = o(√n). To enhance model interpretability by excluding irrelevant predictors from the regression, we suggest an ℓ1-regularization algorithm with a quadratic constraint on the magnitude of the least squares residuals to search for a sparse estimate of β0. Not only does the resulting ℓ1-regularized solution recover β0 consistently, it is also sufficiently sparse, enabling us to select important predictors and thereby facilitate model interpretation while maintaining prediction accuracy. Simulations and an application to car price data further suggest that the proposed estimation procedures have good finite-sample performance and are computationally efficient.
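The constrained ℓ1 formulation described above can be sketched numerically. The snippet below is a minimal illustration, not the paper's actual algorithm: it generates toy single-index data, fits least squares, and then minimizes the ℓ1 norm subject to a quadratic residual constraint via the standard split β = u − v with u, v ≥ 0 (which makes the objective smooth). The residual budget `eta` and all variable names are assumptions chosen for the sketch.

```python
import numpy as np
from scipy.optimize import minimize

# Toy single-index data: Y depends on X only through beta0' X.
rng = np.random.default_rng(0)
n, p = 200, 8
X = rng.standard_normal((n, p))
beta0 = np.zeros(p)
beta0[:2] = [3.0, -4.0]
beta0 /= np.linalg.norm(beta0)          # unit-norm true direction
y = np.sin(X @ beta0) + 0.1 * rng.standard_normal(n)

# Least squares fit; for elliptical X it estimates beta0 up to scale.
b_ls, *_ = np.linalg.lstsq(X, y, rcond=None)
eta = 1.1 * np.sum((y - X @ b_ls) ** 2)  # hypothetical residual budget

# Minimize ||beta||_1 subject to ||y - X beta||^2 <= eta,
# using the smooth reformulation beta = u - v, u >= 0, v >= 0.
def obj(z):
    # sum(u) + sum(v) equals ||beta||_1 at the optimum
    return z.sum()

def resid_slack(z):
    b = z[:p] - z[p:]
    return eta - np.sum((y - X @ b) ** 2)

z0 = np.concatenate([np.clip(b_ls, 0, None), np.clip(-b_ls, 0, None)])
res = minimize(obj, z0, method="SLSQP",
               bounds=[(0, None)] * (2 * p),
               constraints=[{"type": "ineq", "fun": resid_slack}])
beta_hat = res.x[:p] - res.x[p:]
direction = beta_hat / np.linalg.norm(beta_hat)
print(np.round(direction, 2))
```

On this toy example the normalized solution aligns closely with β0, mirroring the direction-recovery property the abstract describes; real data would require a principled choice of the residual budget.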
Keywords
- Diverging parameters
- Inverse regression
- Restricted orthonormality
- Sufficient dimension reduction