Sufficient dimension reduction in regressions through cumulative Hessian directions

Li Mei Zhang, Li Ping Zhu*, Lixing Zhu

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

2 Citations (Scopus)


To reduce the dimension of the predictors without loss of information on the regression, we develop in this paper a sufficient dimension reduction method that we term cumulative Hessian directions. Unlike many existing sufficient dimension reduction methods, estimation of our proposal completely avoids selecting tuning parameters such as the number of slices in slicing estimation or the bandwidth in kernel smoothing. We also investigate the asymptotic properties of our proposal when the dimension of the predictors diverges. Simulations and a real-data application illustrate the efficacy of our proposal and compare it with existing methods.
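The paper's exact cumulative Hessian estimator is not reproduced in this abstract, so the sketch below is only an illustration of the tuning-free, cumulative inverse-regression idea the abstract describes: instead of choosing a number of slices or a bandwidth, one accumulates an inverse-regression moment over every observed response value and extracts directions from the resulting candidate matrix. The function name and the cumulative-slicing-style moment used here are assumptions, not the authors' estimator.

```python
import numpy as np

def cumulative_sdr_directions(X, y, d):
    """Illustrative tuning-free SDR sketch (cumulative-slicing style, NOT the
    paper's cumulative Hessian estimator).

    Estimates m(t) = E[(X - EX) 1{Y <= t}] at every observed response value,
    forms the candidate matrix M = avg_k m(y_k) m(y_k)^T, and returns the d
    leading generalized eigenvectors of M with respect to cov(X). No slices
    or bandwidths are ever chosen.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)                      # center the predictors
    Sigma = Xc.T @ Xc / n                        # sample covariance of X
    # indicator matrix: I[i, k] = 1 if y_i <= y_k (cumulative "slices")
    I = (y[:, None] <= y[None, :]).astype(float)
    # column k of Mhat is the sample estimate of m(y_k), shape (p, n)
    Mhat = Xc.T @ I / n
    M = Mhat @ Mhat.T / n                        # candidate matrix
    # generalized eigenproblem M v = lambda Sigma v via whitening
    w, U = np.linalg.eigh(Sigma)
    Sinv_half = U @ np.diag(1.0 / np.sqrt(w)) @ U.T
    evals, evecs = np.linalg.eigh(Sinv_half @ M @ Sinv_half)
    # map the d leading whitened eigenvectors back to the original scale
    B = Sinv_half @ evecs[:, ::-1][:, :d]
    return B / np.linalg.norm(B, axis=0)
```

On a single-index model such as y = x'beta + noise, the leading estimated direction should align closely with beta, which is the sense in which such estimators recover the central subspace.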

Original language: English
Pages (from-to): 325-334
Number of pages: 10
Journal: Statistics and Computing
Issue number: 3
Publication status: Published - Jul 2011

Scopus Subject Areas

  • Theoretical Computer Science
  • Statistics and Probability
  • Statistics, Probability and Uncertainty
  • Computational Theory and Mathematics

User-Defined Keywords

  • Central subspace
  • Diverging parameters
  • Inverse regression
  • Sufficient dimension reduction

