Abstract
In the era of big data, many data sets are recorded at equally spaced time points. To model the rate of change of such data, one often constructs a nonparametric regression model and then estimates the first derivative of the mean function. In this direction, we propose a symmetric two-sided local constant regression for interior points, an asymmetric two-sided local polynomial regression for boundary points, and a one-sided local linear forecasting model for outside points. Specifically, under the framework of locally weighted least squares regression, we derive the asymptotic bias and variance of the proposed estimators and establish their asymptotic normality. Moreover, to reduce the estimation bias for highly oscillatory functions, we propose debiased estimators based on high-order polynomials and derive their corresponding kernel functions. A data-driven two-step procedure for the simultaneous selection of the model and tuning parameters is also proposed. Finally, the usefulness of the proposed estimators is demonstrated through simulation studies and two real data examples.
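The abstract describes derivative estimation through locally weighted least squares regression on equally spaced data. As an illustration only, and not the authors' proposed estimators, the following minimal Python sketch estimates the first derivative m'(x0) at an interior point by fitting a local quadratic with Epanechnikov kernel weights; the function name, kernel choice, polynomial degree, and bandwidth are all illustrative assumptions.

```python
import numpy as np

def local_poly_derivative(x, y, x0, h, degree=2):
    """Estimate m'(x0) by locally weighted least squares.

    Fits y_i ~ b0 + b1*(x_i - x0) + ... + b_degree*(x_i - x0)^degree
    with Epanechnikov kernel weights K((x_i - x0)/h); the fitted b1
    is the local polynomial estimate of the first derivative at x0.
    (Illustrative sketch, not the estimators proposed in the paper.)
    """
    u = (x - x0) / h
    w = np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)  # Epanechnikov kernel weights
    X = np.vander(x - x0, N=degree + 1, increasing=True)  # columns: 1, (x - x0), (x - x0)^2, ...
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)  # weighted least squares
    return beta[1]  # coefficient of (x - x0) estimates m'(x0)

# Toy example on an equally spaced grid: true derivative at x0 = 0.5 is -2*pi
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)
print(local_poly_derivative(x, y, x0=0.5, h=0.1))
```

In practice, the bandwidth h and the polynomial degree act as the tuning and model parameters whose data-driven selection the paper addresses; here they are simply fixed by hand.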
| Original language | English |
|---|---|
| Article number | 107781 |
| Journal | Knowledge-Based Systems |
| Volume | 236 |
| DOIs | |
| Publication status | Published - 25 Jan 2022 |
Scopus Subject Areas
- Management Information Systems
- Software
- Information Systems and Management
- Artificial Intelligence
User-Defined Keywords
- Boundary problem
- Differenced estimation
- Equally spaced design
- Kernel learning
- Nonparametric derivative estimation