Estimating a sparse reduction for general regression in high dimensions

Tao Wang, Mengjie Chen, Hongyu Zhao, Lixing Zhu*

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

17 Citations (Scopus)

Abstract

Although the concept of sufficient dimension reduction was proposed long ago, the literature has largely focused on properties of estimators of dimension-reduction subspaces in the classical "small p, large n" setting. Rather than the subspace, this paper considers directly the set of reduced predictors, which we believe is more relevant for subsequent analyses. A principled method is proposed for estimating a sparse reduction, based on a new, revised representation of the well-known sliced inverse regression. A fast and efficient algorithm is developed for computing the estimator. The asymptotic behavior of the new method is studied when the number of predictors, p, exceeds the sample size, n, providing a guide for choosing the number of sufficient dimension-reduction predictors. Numerical results, including a simulation study and a cancer-drug-sensitivity data analysis, are presented to examine the performance.
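For readers unfamiliar with the method the paper builds on, the following is a minimal sketch of classical sliced inverse regression (SIR) as introduced by Li (1991): standardize the predictors, slice the response, and take the leading eigenvectors of the covariance of slice means. This is an illustration of the baseline technique only, not the authors' sparse estimator; the function name and slicing scheme are our own choices.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, d=1):
    """Classical sliced inverse regression: estimate d
    dimension-reduction directions (a sketch of the baseline
    method, not the paper's sparse estimator)."""
    n, p = X.shape
    # Center and whiten the predictors: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n
    evals, evecs = np.linalg.eigh(Sigma)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    # Slice the observations by the ordered response
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted covariance of the within-slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original scale
    w, v = np.linalg.eigh(M)
    B = inv_sqrt @ v[:, ::-1][:, :d]
    return B / np.linalg.norm(B, axis=0)
```

In a toy single-index model y = x'b + noise, the first estimated direction should align closely with b. The sparse, high-dimensional (p > n) estimator proposed in the paper replaces this eigendecomposition with a penalized formulation built on a revised representation of SIR.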

Original language: English
Pages (from-to): 33-46
Number of pages: 14
Journal: Statistics and Computing
Volume: 28
Issue number: 1
DOIs
Publication status: Published - 1 Jan 2018

Scopus Subject Areas

  • Theoretical Computer Science
  • Statistics and Probability
  • Statistics, Probability and Uncertainty
  • Computational Theory and Mathematics

User-Defined Keywords

  • Inverse modeling
  • Model-free dimension reduction
  • Sparsity
