Forward Variable Selection for Sparse Ultra-High Dimensional Varying Coefficient Models

Ming-Yen Cheng, Toshio Honda, Jin-Ting Zhang

Research output: Contribution to journal › Journal article › peer-review


Abstract

Varying coefficient models have numerous applications in a wide range of scientific areas. While enjoying nice interpretability, they also allow for flexibility in modeling the dynamic impacts of the covariates. However, in the era of big data, it is challenging to select the relevant variables when the dimensionality is very large. Recently, several works have addressed this important problem under sparsity assumptions; they are subject to some limitations, however. We introduce an appealing forward selection procedure. It selects important variables sequentially according to a reduction-in-sum-of-squares criterion, and it employs a Bayesian information criterion (BIC)-based stopping rule. Hence it is simple to implement and fast to compute, and it possesses many other desirable properties from both theoretical and numerical viewpoints. The BIC is a special case of the extended BIC (EBIC), obtained when an extra tuning parameter in the latter vanishes. We establish rigorous screening consistency results when either the BIC or the EBIC is used as the stopping criterion. The theoretical results depend on some conditions on the eigenvalues related to the design matrices, which can be relaxed in some situations. Results of an extensive simulation study and a real data example are also presented to show the efficacy and usefulness of our procedure. Supplementary materials for this article are available online.
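The abstract only outlines the procedure, so the following is a minimal sketch of the general idea in Python, not the authors' implementation: each candidate coefficient function is approximated by a basis expansion in the index variable (a plain polynomial basis here, purely for illustration), covariates are added greedily by the largest reduction in the residual sum of squares, and selection stops once the (E)BIC stops decreasing. All function names, the choice of basis, and the toy data are hypothetical; setting gamma = 0 in the EBIC recovers the ordinary BIC, mirroring the relationship mentioned above.

```python
import numpy as np
from math import lgamma, log

def basis(u, n_basis=4):
    """Polynomial basis in the index variable u, used here as a simple
    stand-in for the basis expansion approximating each coefficient function."""
    return np.column_stack([u ** k for k in range(n_basis)])

def ebic(rss, n, n_params, p, n_selected, gamma=0.0):
    """Extended BIC; gamma = 0 reduces it to the ordinary BIC."""
    log_model_space = lgamma(p + 1) - lgamma(n_selected + 1) - lgamma(p - n_selected + 1)
    return n * log(rss / n) + n_params * log(n) + 2.0 * gamma * log_model_space

def forward_select(y, X, u, n_basis=4, gamma=0.0, max_steps=None):
    """Greedy forward selection: at each step add the covariate whose
    basis-expanded block gives the largest reduction in the residual sum
    of squares, and stop as soon as the (E)BIC no longer decreases."""
    n, p = X.shape
    B = basis(u, n_basis)
    Z = np.ones((n, 1))                       # current design (intercept only)
    selected, candidates = [], list(range(p))
    best_crit = np.inf
    max_steps = max_steps or min(p, n // n_basis)
    for _ in range(max_steps):
        rss_by_cand = {}
        for j in candidates:
            Zj = np.hstack([Z, X[:, j:j + 1] * B])
            coef, *_ = np.linalg.lstsq(Zj, y, rcond=None)
            rss_by_cand[j] = float(np.sum((y - Zj @ coef) ** 2))
        j_star = min(rss_by_cand, key=rss_by_cand.get)   # largest RSS reduction
        crit = ebic(rss_by_cand[j_star], n, Z.shape[1] + n_basis,
                    p, len(selected) + 1, gamma)
        if crit >= best_crit:                            # (E)BIC stopping rule
            break
        best_crit = crit
        selected.append(j_star)
        candidates.remove(j_star)
        Z = np.hstack([Z, X[:, j_star:j_star + 1] * B])
    return selected

# Toy illustration (hypothetical data): 200 covariates, only two are relevant.
rng = np.random.default_rng(0)
n, p = 300, 200
u = rng.uniform(size=n)
X = rng.standard_normal((n, p))
y = np.sin(2 * np.pi * u) * X[:, 0] + (1 + u) * X[:, 1] + 0.5 * rng.standard_normal(n)
print(forward_select(y, X, u))   # typically selects covariates 0 and 1 (in some order)
```

Because each step only requires least-squares fits of small augmented designs, the cost grows modestly with the ambient dimension, which is consistent with the abstract's claim that the procedure is simple to implement and fast to compute.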

Original language: English
Pages (from-to): 1209-1221
Journal: Journal of the American Statistical Association
Volume: 111
Issue number: 515
Publication status: Published - Jul 2016

