Inference for biased transformation models

Xuehu Zhu, Tao Wang, Junlong Zhao, Lixing ZHU*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

Working regression models are often kept parsimonious for practical use, but this parsimony can make them biased: either some predictors with strong signals to the response are omitted from the working model, or too many weak signals are excluded at the modeling stage, and their effects accumulate into bias. Consistently estimating the parameters of interest in such biased working models is therefore a challenge. This paper investigates the estimation problem for linear transformation models with three aims. First, to identify strong signals in the original full model, a sufficient dimension reduction approach is applied to transfer linear transformation models to pro forma linear models; this efficiently avoids high-dimensional nonparametric estimation of the unknown model transformation. Second, after the strong signals are identified, a semiparametric re-modeling with artificially constructed predictors is performed to correct the model bias in the working models. The construction procedure is described, and a ridge ratio estimation is proposed to determine the number of these predictors. Third, root-n consistent estimators of the parameters in the working models are defined and their asymptotic normality is proved. The performance of the new method is illustrated through simulation studies and a real data analysis.
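The two main ingredients described above — a sufficient dimension reduction step that reduces the transformation model to a pro forma linear model, and a ridge-ratio criterion for choosing the number of retained directions — can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact estimator: it uses sliced inverse regression (SIR) as a standard sufficient dimension reduction method, and a generic ridge-ratio rule on the SIR eigenvalues; the slice count `n_slices` and ridge constant `c` are tuning assumptions.

```python
import numpy as np

def sir_directions(X, y, n_slices=10):
    """Sliced inverse regression (SIR), a standard sufficient dimension
    reduction method. Returns the eigenvalues of the SIR candidate matrix
    (descending) and the corresponding directions in the original scale.
    Illustrative only; not the paper's specific estimator."""
    n, p = X.shape
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    # Stable inverse square root of the predictor covariance
    w, V = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    Z = (X - mu) @ Sigma_inv_sqrt
    # Slice the standardized predictors by the ordered response and
    # accumulate the weighted outer products of the slice means
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    lam, B = np.linalg.eigh(M)
    lam, B = lam[::-1], B[:, ::-1]          # descending eigenvalues
    return lam, Sigma_inv_sqrt @ B          # back-transform directions

def ridge_ratio_rank(lam, c=None):
    """Choose the number of non-null directions by a ridge-ratio rule:
    add a small ridge c to damp the noise eigenvalues, then take the
    argmin of the successive eigenvalue ratios. The default c is an
    ad hoc assumption for this sketch."""
    if c is None:
        c = 0.1 * lam[0] if lam[0] > 0 else 1e-6
    ratios = (lam[1:] + c) / (lam[:-1] + c)
    return int(np.argmin(ratios)) + 1
```

For a single-index transformation model such as y = (x'β)^3 + ε, the leading SIR direction recovers β up to scale, and the ridge-ratio rule selects one direction because the eigenvalue ratio drops sharply after the first eigenvalue.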

Original language: English
Pages (from-to): 105-120
Number of pages: 16
Journal: Computational Statistics and Data Analysis
Volume: 109
DOIs
Publication status: Published - 1 May 2017

Scopus Subject Areas

  • Statistics and Probability
  • Computational Mathematics
  • Computational Theory and Mathematics
  • Applied Mathematics

User-Defined Keywords

  • Estimation consistency
  • Linear transformation models
  • Model bias correction
  • Non-sparse structure
