Abstract
This paper proposes a novel prediction approach for a semi-functional linear model comprising a functional component and a nonparametric component. The study establishes the minimax optimal rates of convergence for this model, revealing that the functional component can be learned at the same minimax rate as if the nonparametric component were known, and vice versa. This result is achieved by using a double-penalized least squares method to estimate both the functional and nonparametric components within the framework of reproducing kernel Hilbert spaces. Thanks to the representer theorem, the approach also offers other desirable features, including algorithmic efficiency, as no iterations are required. We also provide numerical studies to demonstrate the effectiveness of the method and to validate the theoretical analysis.
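As a rough illustration of the kind of estimator the abstract describes, the sketch below fits a semi-functional linear model Y = ⟨X, β⟩ + g(Z) + ε by double-penalized least squares with two kernels, one for the functional slope β and one for the nonparametric component g. By the representer theorem, the minimizer is characterized by a single linear system, so no iterations are needed. All specifics here (Gaussian kernels, the simulated data-generating model, the penalty values) are illustrative assumptions, not the paper's actual choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 60, 101                       # sample size; grid points for the functional covariate
t = np.linspace(0, 1, m)

# Hypothetical data-generating model (illustrative only, not the paper's simulation):
# Y = <X, beta> + g(Z) + noise, with beta(t) = sin(2*pi*t) and g(z) = cos(pi*z).
X = rng.standard_normal((n, 5)) @ np.array([np.sin((k + 1) * np.pi * t) for k in range(5)])
Z = rng.uniform(-1, 1, n)
beta_true = np.sin(2 * np.pi * t)
y = X @ beta_true / m + np.cos(np.pi * Z) + 0.1 * rng.standard_normal(n)

def gauss(u, v, h):
    """Gaussian kernel matrix between point sets u and v (a stand-in kernel choice)."""
    return np.exp(-(u[:, None] - v[None, :]) ** 2 / (2 * h ** 2))

K1 = gauss(t, t, 0.1)                # RKHS kernel for beta, evaluated on the grid
Sigma = (X @ K1 @ X.T) / m ** 2      # Sigma_ij ~ double integral of X_i(s) K1(s,u) X_j(u)
G = gauss(Z, Z, 0.3)                 # Gram matrix of the kernel for g at the Z_i

lam1, lam2 = 1e-3, 1e-3              # the two penalty parameters (arbitrary values here)
# Representer theorem: beta and g are spanned by kernel sections, with coefficients
# a, b solving a sufficient linear system for the stationarity conditions:
#   Sigma a + G b + n*lam1 a = y   and   Sigma a + G b + n*lam2 b = y
A = np.block([[Sigma + n * lam1 * np.eye(n), G],
              [Sigma, G + n * lam2 * np.eye(n)]])
ab = np.linalg.solve(A, np.concatenate([y, y]))
a, b = ab[:n], ab[n:]

beta_hat = (K1 @ X.T @ a) / m        # beta_hat(t) = sum_i a_i * int K1(t,s) X_i(s) ds
g_hat = G @ b                        # fitted nonparametric component at the Z_i
fit = Sigma @ a + g_hat              # in-sample predictions, obtained without iterations
print("training RMSE:", np.sqrt(np.mean((y - fit) ** 2)))
```

The single call to `np.linalg.solve` reflects the "no iterations" point in the abstract: once the two Gram matrices are formed, both components are estimated jointly in closed form.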
| Original language | English |
|---|---|
| Article number | 2350031 |
| Pages (from-to) | 467-505 |
| Number of pages | 39 |
| Journal | Analysis and Applications |
| Volume | 22 |
| Issue number | 3 |
| Early online date | 21 Dec 2023 |
| DOIs | |
| Publication status | Published - Apr 2024 |
Scopus Subject Areas
- Analysis
- Applied Mathematics
User-Defined Keywords
- convergence rate
- learning theory
- minimax optimality
- reproducing kernel Hilbert spaces
- semi-functional linear regression