On the shrinkage of local linear curve estimators

Ming Yen Cheng, Peter Hall, D. M. Titterington

Research output: Contribution to journal · Article · peer-review

Abstract

Local linear curve estimators are typically constructed using a compactly supported kernel, which minimizes edge effects and (in the case of the Epanechnikov kernel) optimizes asymptotic performance in a mean square sense. The use of compactly supported kernels can produce numerical problems, however. A common remedy is ‘ridging’, which may be viewed as shrinkage of the local linear estimator towards the origin. In this paper we propose a general form of shrinkage, and suggest that, in practice, shrinkage be towards a proper curve estimator. For the latter we propose a local linear estimator based on an infinitely supported kernel. This approach is resistant to the selection of too large a shrinkage parameter, which can impair performance when shrinkage is towards the origin. It also removes the numerical instability that results from using a compactly supported kernel, and enjoys very good mean squared error properties.
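To make the setting concrete, the following is a minimal sketch (not the authors' implementation) of a local linear estimator with a simple additive ridge term guarding the weighted least squares denominator. The function name `local_linear`, the particular placement of the ridge, and the parameter names are illustrative assumptions; the abstract's point is that with a compactly supported kernel such as the Epanechnikov the denominator can vanish when the window contains too few design points, which is what the ridge protects against.

```python
import numpy as np

def epanechnikov(u):
    """Compactly supported Epanechnikov kernel on [-1, 1]."""
    return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)

def local_linear(x0, x, y, h, kernel=epanechnikov, ridge=0.0):
    """Local linear fit at x0 with bandwidth h.

    Solves the kernel-weighted least squares problem
        min_{a,b} sum_i K((x_i - x0)/h) * (y_i - a - b*(x_i - x0))^2
    and returns the intercept a, the estimate of the curve at x0.
    `ridge` is a simple illustrative ridging device: it is added to the
    normal-equations denominator so the estimate stays finite even when
    the compactly supported kernel leaves too few points in the window.
    """
    d = x - x0
    w = kernel(d / h)
    # Weighted design moments and weighted response moments.
    s0, s1, s2 = np.sum(w), np.sum(w * d), np.sum(w * d**2)
    t0, t1 = np.sum(w * y), np.sum(w * d * y)
    denom = s0 * s2 - s1**2 + ridge  # ridge > 0 prevents a zero denominator
    return (s2 * t0 - s1 * t1) / denom
```

With `ridge=0` and an evaluation point far from the data, the denominator is exactly zero and the estimate is undefined; any positive ridge yields a finite value, at the cost of shrinking the fit towards the origin, which is the behaviour the paper proposes to improve by shrinking towards an infinitely supported kernel estimator instead.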
Original language: English
Pages (from-to): 11-17
Number of pages: 7
Journal: Statistics and Computing
Volume: 7
Issue number: 1
DOIs
Publication status: Published - Mar 1997
Externally published: Yes

User-Defined Keywords

  • bandwidth
  • bias
  • compactly supported kernel
  • kernel estimator
  • mean squared error
  • ridge parameter
  • smoothing
  • variance

