TY - JOUR
T1 - Robust estimation of derivatives using locally weighted least absolute deviation regression
AU - Wang, Wen Wu
AU - Yu, Ping
AU - Lin, Lu
AU - Tong, Tiejun
N1 - Funding Information:
We would like to thank the two anonymous reviewers and the action editor for their constructive comments, which improved the quality of the paper. Wang’s work was supported by Qufu Normal University, the University of Hong Kong, and Hong Kong Baptist University. Lin’s work was supported by projects of the National Natural Science Foundation of China (NNSF).
PY - 2019/1
Y1 - 2019/1
N2 - In nonparametric regression, derivative estimation has attracted much attention in recent years due to its wide applications. In this paper, we propose a new method for derivative estimation based on locally weighted least absolute deviation regression. Unlike local polynomial regression, the proposed method does not require a finite variance for the error term and is therefore robust to heavy-tailed errors. In comparison with local median regression, it does not require the error term to have a zero median or a positive density at zero. We further show that the proposed estimator with random difference is asymptotically equivalent to the (infinitely) composite quantile regression estimator. In other words, running one regression is equivalent to combining infinitely many quantile regressions. In addition, the proposed method is extended to estimate derivatives at the boundaries and to estimate higher-order derivatives. For the equidistant design, we derive theoretical results for the proposed estimators, including the asymptotic bias and variance, consistency, and asymptotic normality. Finally, we conduct simulation studies to demonstrate that the proposed method outperforms existing methods in the presence of outliers and heavy-tailed errors, and we analyze Chinese house price data from the past ten years to illustrate the usefulness of the proposed method.
AB - In nonparametric regression, derivative estimation has attracted much attention in recent years due to its wide applications. In this paper, we propose a new method for derivative estimation based on locally weighted least absolute deviation regression. Unlike local polynomial regression, the proposed method does not require a finite variance for the error term and is therefore robust to heavy-tailed errors. In comparison with local median regression, it does not require the error term to have a zero median or a positive density at zero. We further show that the proposed estimator with random difference is asymptotically equivalent to the (infinitely) composite quantile regression estimator. In other words, running one regression is equivalent to combining infinitely many quantile regressions. In addition, the proposed method is extended to estimate derivatives at the boundaries and to estimate higher-order derivatives. For the equidistant design, we derive theoretical results for the proposed estimators, including the asymptotic bias and variance, consistency, and asymptotic normality. Finally, we conduct simulation studies to demonstrate that the proposed method outperforms existing methods in the presence of outliers and heavy-tailed errors, and we analyze Chinese house price data from the past ten years to illustrate the usefulness of the proposed method.
KW - Composite quantile regression
KW - Differenced method
KW - LowLAD
KW - LowLSR
KW - Outlier and heavy-tailed error
KW - Robust nonparametric derivative estimation
UR - https://dl.acm.org/doi/10.5555/3322706.3362001
UR - https://www.jmlr.org/papers/v20/17-340.html
UR - http://www.scopus.com/inward/record.url?scp=85072643037&partnerID=8YFLogxK
M3 - Journal article
AN - SCOPUS:85072643037
SN - 1532-4435
VL - 20
SP - 2157
EP - 2205
JO - Journal of Machine Learning Research
JF - Journal of Machine Learning Research
IS - 1
ER -