Optimal prediction for kernel-based semi-functional linear regression

Keli Guo, Jun Fan*, Lixing Zhu

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

Abstract

This paper proposes a novel prediction approach for a semi-functional linear model comprising a functional component and a nonparametric component. The study establishes the minimax optimal rates of convergence for this model, revealing that the functional component can be learned at the same minimax rate as if the nonparametric component were known, and vice versa. This rate is achieved by a double-penalized least squares method that estimates both the functional and nonparametric components within the framework of reproducing kernel Hilbert spaces. Thanks to the representer theorem, the approach also offers other desirable features, including computational efficiency: the estimator is obtained in closed form and requires no iterations. We also provide numerical studies that demonstrate the effectiveness of the method and validate the theoretical analysis.

Original language: English
Article number: 2350031
Journal: Analysis and Applications
DOIs
Publication status: E-pub ahead of print - 21 Dec 2023

Scopus Subject Areas

  • Analysis
  • Applied Mathematics

User-Defined Keywords

  • convergence rate
  • learning theory
  • minimax optimality
  • reproducing kernel Hilbert spaces
  • semi-functional linear regression
