Nonconcave penalized M-estimation with a diverging number of parameters

Gaorong Li*, Heng Peng, Lixing Zhu

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

94 Citations (Scopus)

Abstract

M-estimation is a widely used technique for robust statistical inference. In this paper, we investigate the asymptotic properties of a nonconcave penalized M-estimator in sparse, high-dimensional, linear regression models. Compared with classical M-estimation, the nonconcave penalized M-estimation method performs parameter estimation and variable selection simultaneously. The proposed method is resistant to heavy-tailed errors or outliers in the response. We show that, under appropriate conditions, the nonconcave penalized M-estimator has the so-called "oracle property": it selects variables consistently, and the estimators of the nonzero coefficients have the same asymptotic distribution as they would if the zero coefficients were known in advance. We obtain consistency and asymptotic normality of the estimators when the dimension pn of the predictors satisfies pn log n/n → 0 and pn²/n → 0, respectively, where n is the sample size. Based on the idea of sure independence screening (SIS) and rank correlation, a robust rank SIS (RSIS) is introduced to deal with ultra-high-dimensional data. Simulation studies were carried out to assess the performance of the proposed method in finite samples, and a dataset was analyzed for illustration.
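
The sketch below is not the authors' code; it is a minimal illustration, under stated assumptions, of the two ingredients described in the abstract: a nonconcave penalized M-estimation objective (here using the Huber loss as the robust loss and the SCAD penalty as the nonconcave penalty, both standard choices) and RSIS-style screening by rank correlation (here Kendall's tau). All function names, the choice of loss, and the tuning constants are illustrative assumptions, not the paper's exact specification.

```python
# Illustrative sketch only (not the authors' implementation).
# Assumptions: Huber loss as the robust rho-function, SCAD as the nonconcave
# penalty, Kendall's tau as the rank correlation used for screening.
import numpy as np
from scipy.stats import kendalltau


def huber_loss(r, c=1.345):
    """Huber's rho function, a common robust loss for M-estimation."""
    a = np.abs(r)
    return np.where(a <= c, 0.5 * r**2, c * a - 0.5 * c**2)


def scad_penalty(beta, lam, a=3.7):
    """SCAD penalty (a nonconcave penalty), summed over coordinates."""
    b = np.abs(beta)
    per_coord = np.where(
        b <= lam,
        lam * b,
        np.where(
            b <= a * lam,
            (2 * a * lam * b - b**2 - lam**2) / (2 * (a - 1)),
            (a + 1) * lam**2 / 2,
        ),
    )
    return per_coord.sum()


def penalized_m_objective(beta, X, y, lam):
    """Average robust loss of the residuals plus the SCAD penalty
    (scaling of the two terms is an illustrative choice)."""
    r = y - X @ beta
    return huber_loss(r).mean() + scad_penalty(beta, lam)


def rsis(X, y, d):
    """Rank SIS: score each predictor by |Kendall's tau| with the response
    and keep the indices of the d highest-scoring predictors."""
    scores = []
    for j in range(X.shape[1]):
        tau, _ = kendalltau(X[:, j], y)
        scores.append(abs(tau))
    return np.argsort(scores)[::-1][:d]
```

A typical use of such a sketch would be to first screen an ultra-high-dimensional design matrix down to d = floor(n / log n) predictors with rsis, and then minimize penalized_m_objective over the retained predictors with a suitable optimizer; the minimization step itself is not shown here.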

Original language: English
Pages (from-to): 391-419
Number of pages: 29
Journal: Statistica Sinica
Volume: 21
Issue number: 1
Publication status: Published - Jan 2011

Scopus Subject Areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

User-Defined Keywords

  • Linear model
  • Oracle property
  • Rank correlation
  • Robust estimation
  • SIS
  • Variable selection
