Debiased learning and forecasting of first derivative

Wen Wu Wang*, Jun Lu, Tiejun Tong, Zhonghua Liu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

In the era of big data, many data sets are recorded at equal intervals of time. To model the rate of change of such data, one often constructs a nonparametric regression model and then estimates the first derivative of the mean function. To this end, we propose a symmetric two-sided local constant regression for interior points, an asymmetric two-sided local polynomial regression for boundary points, and a one-sided local linear forecasting model for outside points. Specifically, under the framework of locally weighted least squares regression, we derive the asymptotic bias and variance of the proposed estimators and establish their asymptotic normality. Moreover, to reduce the estimation bias for highly oscillatory functions, we propose debiased estimators based on high-order polynomials and derive their corresponding kernel functions. A data-driven two-step procedure for simultaneous selection of the model and tuning parameters is also proposed. Finally, the usefulness of the proposed estimators is demonstrated by simulation studies and two real data examples.
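To illustrate the general idea behind locally weighted least squares derivative estimation on an equally spaced design, here is a minimal sketch. This is not the authors' proposed estimator (the paper's debiased kernels, boundary corrections, and parameter-selection procedure are not reproduced here); it is a generic local polynomial fit with an assumed Gaussian kernel, where the coefficient of the linear term estimates the first derivative at an interior point.

```python
import numpy as np

def local_poly_derivative(x, y, x0, h, degree=2):
    """Estimate f'(x0) by fitting a degree-p polynomial around x0
    with kernel weights; beta[1] is the slope, i.e. the derivative.
    (Generic sketch; Gaussian kernel and bandwidth h are assumptions.)"""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)              # kernel weights
    X = np.vander(x - x0, degree + 1, increasing=True)  # 1, (x-x0), (x-x0)^2, ...
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)    # weighted LS solution
    return beta[1]

# Equally spaced design with a smooth mean function plus noise
n = 200
x = np.linspace(0.0, 1.0, n)
rng = np.random.default_rng(0)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.05, n)

est = local_poly_derivative(x, y, x0=0.5, h=0.1, degree=2)
true = 2 * np.pi * np.cos(2 * np.pi * 0.5)  # exact derivative at 0.5
```

Raising the polynomial degree reduces bias for oscillatory functions at the cost of variance, which is the trade-off the paper's debiased estimators and two-step parameter-selection procedure are designed to manage.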

Original language: English
Article number: 107781
Journal: Knowledge-Based Systems
Volume: 236
DOIs
Publication status: Published - 25 Jan 2022

Scopus Subject Areas

  • Management Information Systems
  • Software
  • Information Systems and Management
  • Artificial Intelligence

User-Defined Keywords

  • Boundary problem
  • Differenced estimation
  • Equally spaced design
  • Kernel learning
  • Nonparametric derivative estimation
