TY - JOUR
T1 - Nyström subsampling for functional linear regression
AU - Fan, Jun
AU - Liu, Jiading
AU - Shi, Lei
N1 - Funding Information:
The work by J. Fan is partially supported by the Research Grants Council of Hong Kong [Project Nos. 12302819 and 12303220], the Guangdong Basic and Applied Basic Research Fund [Project No. 2024A1515011878], and Hong Kong Baptist University, Hong Kong [Project No. RC-FNRA-IG/22-23/SCI/02]. The work by L. Shi is partially supported by the National Natural Science Foundation of China (Grant No. 12171039). All authors contributed equally to this work and are listed alphabetically.
Publisher Copyright:
© 2025 Published by Elsevier Inc.
PY - 2025/4/16
Y1 - 2025/4/16
AB - Kernel methods have proven to be highly effective for functional data analysis, demonstrating significant theoretical and practical success over the past two decades. However, their computational complexity and storage requirements hinder their direct application to large-scale functional data learning problems. In this paper, we address this limitation by investigating the theoretical properties of the Nyström subsampling method within the framework of the functional linear regression model and reproducing kernel Hilbert space. Our proposed algorithm not only overcomes the computational challenges but also achieves the minimax optimal rate of convergence for the excess prediction risk, provided an appropriate subsampling size is chosen. Our error analysis relies on the approximation of integral operators induced by the reproducing kernel and covariance function.
KW - Functional linear regression
KW - Nyström subsampling
KW - Reproducing kernel Hilbert space
KW - Integral operator approximation
KW - Convergence analysis
KW - Minimax optimality
UR - http://www.scopus.com/inward/record.url?scp=105002885583&partnerID=8YFLogxK
U2 - 10.1016/j.jat.2025.106176
DO - 10.1016/j.jat.2025.106176
M3 - Journal article
AN - SCOPUS:105002885583
SN - 0021-9045
VL - 310
JO - Journal of Approximation Theory
JF - Journal of Approximation Theory
M1 - 106176
ER -