TY - JOUR
T1 - Optimal Estimation of Derivatives in Nonparametric Regression
AU - Dai, Wenlin
AU - Tong, Tiejun
AU - Genton, Marc G.
N1 - Funding Information:
The authors thank the editor, the associate editor and the two referees for their constructive comments that led to a substantial improvement of the paper. The work of Wenlin Dai and Marc G. Genton was supported by King Abdullah University of Science and Technology (KAUST). Tiejun Tong's research was supported in part by Hong Kong Baptist University FRG grants FRG1/14-15/044, FRG2/15-16/038, FRG2/15-16/019 and FRG2/14-15/084.
PY - 2016/1
Y1 - 2016/1
N2 - We propose a simple framework for estimating derivatives without fitting the regression function in nonparametric regression. Unlike most existing methods that use the symmetric difference quotients, our method is constructed as a linear combination of observations. It is hence very flexible and applicable to both interior and boundary points, including most existing methods as special cases of ours. Within this framework, we define the variance-minimizing estimators for any order derivative of the regression function with a fixed bias-reduction level. For the equidistant design, we derive the asymptotic variance and bias of these estimators. We also show that our new method will, for the first time, achieve the asymptotically optimal convergence rate for difference-based estimators. Finally, we provide an effective criterion for selection of tuning parameters and demonstrate the usefulness of the proposed method through extensive simulation studies of the first- and second-order derivative estimators.
AB - We propose a simple framework for estimating derivatives without fitting the regression function in nonparametric regression. Unlike most existing methods that use the symmetric difference quotients, our method is constructed as a linear combination of observations. It is hence very flexible and applicable to both interior and boundary points, including most existing methods as special cases of ours. Within this framework, we define the variance-minimizing estimators for any order derivative of the regression function with a fixed bias-reduction level. For the equidistant design, we derive the asymptotic variance and bias of these estimators. We also show that our new method will, for the first time, achieve the asymptotically optimal convergence rate for difference-based estimators. Finally, we provide an effective criterion for selection of tuning parameters and demonstrate the usefulness of the proposed method through extensive simulation studies of the first- and second-order derivative estimators.
KW - Linear combination
KW - Nonparametric derivative estimation
KW - Nonparametric regression
KW - Optimal sequence
KW - Taylor expansion
UR - https://dl.acm.org/doi/10.5555/2946645.3053446
UR - https://www.jmlr.org/papers/v17/15-640.html
UR - https://www.jmlr.org/papers/v17/
UR - http://www.scopus.com/inward/record.url?scp=84995480412&partnerID=8YFLogxK
M3 - Journal article
AN - SCOPUS:84995480412
SN - 1532-4435
VL - 17
SP - 5700
EP - 5724
JO - Journal of Machine Learning Research
JF - Journal of Machine Learning Research
IS - 1
ER -