Dimension reduction and predictor selection in semiparametric models

Zhou Yu, Liping Zhu, Heng Peng, Lixing Zhu

Research output: Contribution to journal › Journal article › peer-review

21 Citations (Scopus)


Dimension reduction in semiparametric regressions includes construction of informative linear combinations and selection of contributing predictors. To reduce the predictor dimension in semiparametric regressions, we propose an ℓ1-minimization of sliced inverse regression with the Dantzig selector, and establish a non-asymptotic error bound for the resulting estimator. We also generalize the regularization concept to sliced inverse regression with an adaptive Dantzig selector. This ensures that all contributing predictors are selected with high probability, and that the resulting estimator is asymptotically normal even when the predictor dimension diverges to infinity. Numerical studies confirm our theoretical observations and demonstrate that our proposals are superior to existing estimators in terms of both dimension reduction and predictor selection.
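The abstract builds on classical sliced inverse regression (SIR), which estimates the informative linear combinations from the covariance of slice-wise means of the standardized predictors. As a point of reference, here is a minimal sketch of the base SIR estimator; the paper's contribution replaces the eigen-decomposition step with an ℓ1-constrained (Dantzig-selector) formulation, which is not reproduced here. The function name and its defaults are illustrative, not from the paper.

```python
import numpy as np

def sir_directions(X, y, n_slices=5, n_dirs=1):
    """Estimate dimension-reduction directions via sliced inverse
    regression (Li, 1991). A hedged sketch of the classical method;
    the paper's adaptive Dantzig-selector variant is more involved."""
    n, p = X.shape
    # Standardize predictors: Z = (X - mean) @ Sigma^{-1/2}
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = (X - mu) @ inv_sqrt
    # Slice the response into roughly equal-count bins
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted covariance of the slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the X scale
    _, evecs = np.linalg.eigh(M)
    B = inv_sqrt @ evecs[:, -n_dirs:]
    return B / np.linalg.norm(B, axis=0)
```

With a monotone single-index model the leading estimated direction aligns closely with the true index vector; the regularized versions studied in the paper additionally zero out the coefficients of non-contributing predictors.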

Original language: English
Pages (from-to): 641-654
Number of pages: 14
Issue number: 3
Publication status: Published - Sept 2013

Scopus Subject Areas

  • Statistics and Probability
  • Mathematics (all)
  • Agricultural and Biological Sciences (miscellaneous)
  • Agricultural and Biological Sciences (all)
  • Statistics, Probability and Uncertainty
  • Applied Mathematics

User-Defined Keywords

  • Dantzig selector
  • Dimension reduction
  • Sliced inverse regression
  • Variable selection

