TY - JOUR
T1 - Error-dependent smoothing rules in local linear regression
AU - Cheng, Ming-Yen
AU - Hall, Peter
N1 - Publisher copyright:
© 2002 Institute of Statistical Science, Academia Sinica
PY - 2002/4
Y1 - 2002/4
N2 - We suggest an adaptive, error-dependent smoothing method for reducing the variance of local-linear curve estimators. It involves weighting the bandwidth used at the ith datum in proportion to a power of the absolute value of the ith residual. We show that the optimal power is 2/3, and with this choice we prove that asymptotic variance can be reduced by 24% in the case of Normal errors, and by 35% for double-exponential errors. These results might appear to violate Jianqing Fan's bounds on the performance of local-linear methods, but note that our approach to smoothing produces nonlinear estimators. In the case of Normal errors, our estimator has slightly better mean squared error performance than that suggested by Fan's minimax bound, calculated by him over all estimators, not just linear ones. However, these improvements are available only for single functions, not uniformly over Fan's function class. Even greater improvements in performance are achievable for error distributions with heavier tails. For symmetric error distributions the method has no first-order effect on bias, and existing bias-reduction techniques may be used in conjunction with error-dependent smoothing. In the case of asymmetric error distributions an overall reduction in mean squared error is achievable, involving a trade-off between bias and variance contributions. However, in this setting the technique is relatively complex and probably not practically feasible.
KW - Bandwidth
KW - kernel method
KW - nonparametric regression
KW - tail weight
KW - variance reduction
UR - http://www3.stat.sinica.edu.tw/statistica/j12n2/j12n24/j12n24.htm
UR - http://www3.stat.sinica.edu.tw/statistica/j12n2/12-2.htm
UR - http://www.scopus.com/inward/record.url?eid=2-s2.0-0036556654&partnerID=MN8TOARS
M3 - Journal article
SN - 1017-0405
VL - 12
SP - 429
EP - 447
JO - Statistica Sinica
JF - Statistica Sinica
IS - 2
ER -