Generalized kernel-based inverse regression methods for sufficient dimension reduction

Chuanlong Xie, Lixing Zhu*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

The linearity condition and the constant conditional variance assumption popularly used in sufficient dimension reduction are close to elliptical symmetry and normality, respectively. However, their restrictiveness has long been a concern. In this article, we provide a systematic study that gives insight into why the popularly used sliced inverse regression and sliced average variance estimation need these conditions. We then propose a new framework that relaxes these conditions and suggest generalized kernel-based inverse regression methods to handle a class of mixture multivariate unified skew-elliptical distributions.
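As background for the estimators being generalized, a minimal sketch of classical sliced inverse regression (SIR), which relies on the linearity condition discussed above, might look like the following. The function name, slice count, and defaults are illustrative choices, not the article's kernel-based estimator:

```python
import numpy as np

def sir_directions(X, y, n_slices=10, d=1):
    """Sketch of classical SIR: estimate d directions of the central
    subspace from the slice means of the standardized predictors.
    This is the baseline method the article generalizes, not the
    authors' generalized kernel-based inverse regression estimator."""
    n, p = X.shape
    # Standardize predictors: Z = Sigma^{-1/2} (X - mean(X))
    Xc = X - X.mean(axis=0)
    eval_, evec = np.linalg.eigh(np.cov(Xc, rowvar=False))
    inv_sqrt = evec @ np.diag(eval_ ** -0.5) @ evec.T
    Z = Xc @ inv_sqrt
    # Slice the sample on the order of y and average Z within slices
    slices = np.array_split(np.argsort(y), n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale
    _, v = np.linalg.eigh(M)
    beta = inv_sqrt @ v[:, ::-1][:, :d]
    return beta / np.linalg.norm(beta, axis=0)
```

Under elliptical (here Gaussian) predictors and a single-index model, the leading SIR direction should align with the true index; the generalized methods of the article are designed for settings, such as mixtures of unified skew-elliptical distributions, where this linearity condition fails.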

Original language: English
Article number: 106995
Journal: Computational Statistics and Data Analysis
Volume: 150
DOIs
Publication status: Published - Oct 2020

Scopus Subject Areas

  • Statistics and Probability
  • Computational Mathematics
  • Computational Theory and Mathematics
  • Applied Mathematics

User-Defined Keywords

  • Stein's Lemma
  • Sufficient dimension reduction
  • Unified skew-elliptical distribution
