TY - JOUR

T1 - A study of local linear ridge regression estimators

AU - Deng, Wen Shuenn

AU - Chu, Chih Kang

AU - Cheng, Ming Yen

N1 - Funding information:
We thank the referees, the associate editor, and the editor for their many valuable comments which substantially improved the presentation. The research was supported by the National Science Council, Republic of China.
Publisher copyright:
© 2001 Elsevier Science B.V. All rights reserved.

PY - 2001/2

Y1 - 2001/2

N2 - In the case of random design nonparametric regression, to correct for the unbounded finite-sample variance of the local linear estimator (LLE), Seifert and Gasser (J. Amer. Statist. Assoc. 91 (1996) 267–275) apply the idea of ridge regression to the LLE and propose the local linear ridge regression estimator (LLRRE). However, the finite-sample and asymptotic properties of the LLRRE are not discussed there. In this paper, upper bounds on the finite-sample variance and bias of the LLRRE are obtained. It is shown that if the ridge regression parameters are not properly selected, the resulting LLRRE has some drawbacks: it may have a nonzero constant asymptotic bias, may suffer from boundary effects, or may fail to share the nice asymptotic bias quality of the LLE. On the other hand, if the ridge regression parameters are properly selected, the resulting LLRRE does not suffer from these problems and has the same asymptotic mean squared error as the LLE. For this purpose, the ridge regression parameters are allowed to depend on the sample size and to converge to 0 as the sample size increases. In practice, cross-validation is applied to select both the bandwidth and the ridge regression parameters. Simulation studies demonstrate that, for reasonable sample sizes, the LLRRE using the cross-validated bandwidth and ridge regression parameters can have smaller sample mean integrated squared error than the LLE using the cross-validated bandwidth.

AB - In the case of random design nonparametric regression, to correct for the unbounded finite-sample variance of the local linear estimator (LLE), Seifert and Gasser (J. Amer. Statist. Assoc. 91 (1996) 267–275) apply the idea of ridge regression to the LLE and propose the local linear ridge regression estimator (LLRRE). However, the finite-sample and asymptotic properties of the LLRRE are not discussed there. In this paper, upper bounds on the finite-sample variance and bias of the LLRRE are obtained. It is shown that if the ridge regression parameters are not properly selected, the resulting LLRRE has some drawbacks: it may have a nonzero constant asymptotic bias, may suffer from boundary effects, or may fail to share the nice asymptotic bias quality of the LLE. On the other hand, if the ridge regression parameters are properly selected, the resulting LLRRE does not suffer from these problems and has the same asymptotic mean squared error as the LLE. For this purpose, the ridge regression parameters are allowed to depend on the sample size and to converge to 0 as the sample size increases. In practice, cross-validation is applied to select both the bandwidth and the ridge regression parameters. Simulation studies demonstrate that, for reasonable sample sizes, the LLRRE using the cross-validated bandwidth and ridge regression parameters can have smaller sample mean integrated squared error than the LLE using the cross-validated bandwidth.

KW - Asymptotic behavior

KW - Boundary effect

KW - Finite-sample behavior

KW - Local linear ridge regression estimator

KW - Local linear estimator

KW - Nonparametric regression

KW - Ridge regression

UR - https://www.scopus.com/inward/record.uri?eid=2-s2.0-0042193221&doi=10.1016%2fS0378-3758%2800%2900161-0&partnerID=40&md5=5fb8393940554af036c4cab5b2af03af

U2 - 10.1016/S0378-3758(00)00161-0

DO - 10.1016/S0378-3758(00)00161-0

M3 - Article

SN - 0378-3758

VL - 93

SP - 225

EP - 238

JO - Journal of Statistical Planning and Inference

JF - Journal of Statistical Planning and Inference

IS - 1–2

ER -