Abstract
In this paper, we study and analyze regularized least squares for the functional linear regression model. The approach uses the reproducing kernel Hilbert space framework and integral operators. Under a more general and realistic assumption on the reproducing kernel and the input data statistics, we show that the rate of the excess prediction risk of the regularized least squares estimator is minimax optimal.
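For orientation, the following is a minimal sketch of the standard functional linear regression setup and the RKHS-regularized least squares estimator described in the abstract; the notation (Y, X, beta_0, lambda, K) is illustrative, and the paper's exact formulation and assumptions may differ.

```latex
% Minimal sketch of the standard setup; notation is illustrative and the
% paper's exact formulation and assumptions may differ.
\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}
Functional linear regression with a scalar response $Y$, a functional
covariate $X$ on a compact domain $T$, slope function $\beta_0$, and noise
$\varepsilon$ (variables assumed centered for simplicity):
\[
  Y = \int_{T} \beta_0(t)\, X(t)\, dt + \varepsilon .
\]
The regularized least squares estimator over an RKHS
$(\mathcal{H}_K, \|\cdot\|_K)$ with reproducing kernel $K$ and
regularization parameter $\lambda > 0$ is
\[
  \widehat{\beta}_\lambda
  = \operatorname*{arg\,min}_{\beta \in \mathcal{H}_K}
    \frac{1}{n} \sum_{i=1}^{n}
    \Bigl( Y_i - \int_{T} \beta(t)\, X_i(t)\, dt \Bigr)^{2}
    + \lambda\, \|\beta\|_{K}^{2} .
\]
The excess prediction risk of an estimate $\beta$ is
\[
  \mathcal{E}(\beta)
  = \mathbb{E}\Bigl[ \Bigl( \int_{T}
    \bigl(\beta(t) - \beta_0(t)\bigr)\, X(t)\, dt \Bigr)^{2} \Bigr],
\]
and the paper establishes that this risk for $\widehat{\beta}_\lambda$
decays at the minimax-optimal rate under general conditions on $K$ and
the covariance of $X$.
\end{document}
```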
| Original language | English |
| --- | --- |
| Pages (from-to) | 85-94 |
| Number of pages | 10 |
| Journal | Journal of Complexity |
| Volume | 49 |
| DOIs | |
| Publication status | Published - Dec 2018 |
Scopus Subject Areas
- Algebra and Number Theory
- Statistics and Probability
- Numerical Analysis
- General Mathematics
- Control and Optimization
- Applied Mathematics
User-Defined Keywords
- Functional linear regression
- Learning rate
- Regularized least squares
- Reproducing kernel Hilbert space