In this paper, we study regularized least squares estimation for the function-on-function regression model. In our model, both the predictors (input data) and responses (output data) are multivariate functions (of d and d̃ variables, respectively), and the model coefficient lies in a reproducing kernel Hilbert space (RKHS). We show, under mild conditions on the reproducing kernel and the input data statistics, that the convergence rate of the excess prediction risk of the regularized least squares estimator is minimax optimal. We test the proposed model on numerical examples from medical image analysis and atmospheric point spread function estimation; the results demonstrate that its performance is comparable with that of competing methods.
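The estimator described above can be illustrated with a minimal discretized sketch. Here the functions are sampled on grids, the integral model Y(t) = ∫ β(s,t) X(s) ds + noise is approximated by a Riemann sum, and the RKHS penalty is replaced by a plain Frobenius-norm (Tikhonov) penalty for simplicity; the grid sizes, the coefficient surface `beta`, and the regularization parameter `lam` are all illustrative assumptions, not quantities from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, p = 200, 50, 40          # samples, predictor grid size, response grid size
s = np.linspace(0, 1, m)       # grid for predictor functions X(s)
t = np.linspace(0, 1, p)       # grid for response functions Y(t)
ds = s[1] - s[0]               # quadrature weight for the Riemann sum

# Hypothetical ground-truth coefficient surface beta(s, t) (illustrative choice)
beta = np.exp(-((s[:, None] - t[None, :]) ** 2) / 0.1)

# Simulate data: Y(t) ≈ ∫ beta(s, t) X(s) ds + noise, discretized
X = rng.standard_normal((n, m))
Y = ds * X @ beta + 0.05 * rng.standard_normal((n, p))

# Regularized least squares with a Frobenius penalty (a simplification of
# the RKHS-norm penalty used in the paper):
#   beta_hat = argmin_B (1/n) ||Y - ds * X B||_F^2 + lam ||B||_F^2
lam = 1e-4
A = (ds ** 2) * X.T @ X + n * lam * np.eye(m)
beta_hat = ds * np.linalg.solve(A, X.T @ Y)

rel_err = np.linalg.norm(beta_hat - beta) / np.linalg.norm(beta)
```

With an RKHS penalty, the normal equations above would instead involve the Gram matrix of the reproducing kernel, but the closed-form linear-solve structure of the estimator is the same.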
Scopus Subject Areas
- Applied Mathematics

Keywords
- function-on-function regression
- integral operator
- non-asymptotic error bound
- regularized least squares
- reproducing kernel Hilbert space