Dimension reduction for conditional variance in regressions

Li Ping Zhu*, Lixing Zhu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

20 Citations (Scopus)

Abstract

Both the conditional mean and the conditional variance are of importance when modeling regressions with high-dimensional predictors. In this paper, we investigate estimation of the conditional variance. To attack the curse of dimensionality, we introduce the notion of a central variance subspace (CVS) to capture the information contained in the conditional variance. To estimate the CVS, the impact of the conditional mean must first be fully removed. To this end, a three-step procedure is proposed: estimate the central mean subspace (CMS) exhaustively by an outer product gradient (OPG) method; estimate the structural dimension of the CMS consistently by a modified Bayesian information criterion (BIC); and estimate the conditional mean by a kernel smoother. After removing the conditional mean from the response, we suggest a squared-residual-based OPG method to identify the CVS. The asymptotic normality of the candidate matrices, and hence of the corresponding eigenvalues and eigenvectors, is obtained. Illustrative examples from simulation studies and a real dataset are presented to assess the finite-sample performance of the theoretical results.
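The three-step procedure described in the abstract can be illustrated with a minimal numerical sketch. The code below is an assumption-laden simplification, not the authors' implementation: it uses a Gaussian kernel with a rule-of-thumb bandwidth, a basic local linear fit for the OPG gradients, and a Nadaraya-Watson smoother for the conditional mean, and it takes the structural dimensions as given rather than selecting them by the modified BIC.

```python
import numpy as np

def opg_directions(X, y, d, h=None):
    """Outer product gradient (OPG) estimator: fit a local linear regression
    at each sample point, average the outer products of the estimated
    gradients, and return the leading d eigenvectors of that average."""
    n, p = X.shape
    if h is None:
        h = 1.2 * n ** (-1.0 / (p + 4))  # rule-of-thumb bandwidth (assumption)
    M = np.zeros((p, p))
    for i in range(n):
        diff = X - X[i]                                    # local deviations
        w = np.exp(-0.5 * np.sum((diff / h) ** 2, axis=1))  # Gaussian weights
        Z = np.hstack([np.ones((n, 1)), diff])             # local linear design
        Zw = Z * w[:, None]
        beta = np.linalg.lstsq(Zw.T @ Z, Zw.T @ y, rcond=None)[0]
        g = beta[1:]                                       # gradient at X[i]
        M += np.outer(g, g) / n
    vals, vecs = np.linalg.eigh(M)                         # ascending order
    return vecs[:, ::-1][:, :d]                            # top-d eigenvectors

def cvs_directions(X, y, d_mean, d_var):
    """Sketch of the three-step procedure: OPG for the CMS directions,
    a kernel-smoother fit of the conditional mean along those directions,
    then a squared-residual OPG step to identify the CVS directions."""
    B = opg_directions(X, y, d_mean)       # step 1: CMS directions
    U = X @ B                              # reduced predictors
    n = X.shape[0]
    h = 1.2 * n ** (-1.0 / (d_mean + 4))
    m = np.empty(n)                        # step 3: Nadaraya-Watson mean fit
    for i in range(n):
        w = np.exp(-0.5 * np.sum(((U - U[i]) / h) ** 2, axis=1))
        m[i] = np.sum(w * y) / np.sum(w)
    r2 = (y - m) ** 2                      # squared residuals
    return opg_directions(X, r2, d_var)    # squared-residual OPG -> CVS
```

In a toy heteroscedastic model such as y = x1 + x2 * e, the mean direction is e1 and the variance direction is e2; the first OPG step targets the former and the squared-residual OPG step targets the latter.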

Original language: English
Pages (from-to): 869-883
Number of pages: 15
Journal: Statistica Sinica
Volume: 19
Issue number: 2
Publication status: Published - Apr 2009

Scopus Subject Areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

User-Defined Keywords

  • Asymptotic normality
  • Central variance subspace
  • Dimension reduction
  • Heteroscedasticity
  • Outer product gradient
