Abstract
In this paper, we study regularized semi-supervised least squares regression with dependent samples. We analyze the regularized algorithm based on reproducing kernel Hilbert spaces and show that, with the use of unlabelled data, the regularized least squares algorithm can achieve the nearly minimax optimal learning rate, up to a logarithmic term, for dependent samples. Our new results improve upon existing results in the literature.
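For orientation, the regularized least squares algorithm over a reproducing kernel Hilbert space referred to in the abstract is, in its standard supervised form, the minimizer of an empirical squared loss plus a norm penalty. The sketch below is a minimal NumPy implementation of that standard estimator with a Gaussian kernel; it does not reproduce the paper's specific semi-supervised scheme for using unlabelled data or its treatment of dependent (non-i.i.d.) samples, and all function names, the kernel choice, and parameter values are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between rows of A and rows of B."""
    sq_dists = (np.sum(A**2, axis=1)[:, None]
                + np.sum(B**2, axis=1)[None, :]
                - 2.0 * A @ B.T)
    return np.exp(-sq_dists / (2.0 * sigma**2))

def rls_fit(X, y, lam=1e-3, sigma=0.3):
    """Kernel regularized least squares (kernel ridge regression).

    Solves  min_f (1/m) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2
    over the RKHS induced by the Gaussian kernel. By the representer
    theorem the minimizer is f(x) = sum_i alpha_i K(x_i, x) with
    alpha = (K + lam * m * I)^{-1} y.
    """
    m = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * m * np.eye(m), y)

def rls_predict(X_train, alpha, X_test, sigma=0.3):
    """Evaluate the fitted regression function at new points."""
    return gaussian_kernel(X_test, X_train, sigma) @ alpha

# Illustrative usage: noisy samples from a smooth target function.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(200)
alpha = rls_fit(X, y)
X_test = np.linspace(-1.0, 1.0, 5).reshape(-1, 1)
print(rls_predict(X, alpha, X_test))
```

In a semi-supervised setting, unlabelled inputs are typically used to build an additional data-dependent penalty or to better estimate quantities tied to the marginal distribution; how this is done here is specified in the paper itself rather than in the sketch above.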
| Original language | English |
|---|---|
| Pages (from-to) | 1347-1360 |
| Number of pages | 14 |
| Journal | Communications in Mathematical Sciences |
| Volume | 16 |
| Issue number | 5 |
| DOIs | |
| Publication status | Published - 2018 |
Scopus Subject Areas
- Mathematics(all)
- Applied Mathematics
User-Defined Keywords
- Least squares regression
- Non-iid sampling
- Regularization
- Semi-supervised learning