Abstract
Huber regression plays a critical role in robust learning owing to its robustness to outliers and heavy-tailed noise. In this paper, we investigate the theoretical foundations of Ivanov-regularized Huber regression within a Reproducing Kernel Hilbert Space (RKHS) framework. By imposing only a weak moment condition on the conditional distribution, we show that the Huber regression estimator, with its scale parameter adapted to the sample size, learns the underlying regression function well. Specifically, we establish explicit convergence rates for the prediction error even when the regression function lies outside the associated RKHS. Our statistical analysis could greatly improve the understanding of the learning behavior of Huber regression in the absence of a light-tailed assumption on the conditional distribution.
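To make the setting concrete, the sketch below illustrates the kind of estimator the abstract describes: minimizing the empirical Huber risk over an RKHS ball of radius `R` (Ivanov regularization constrains the RKHS norm rather than penalizing it). This is a minimal illustrative implementation, not the paper's method: the Gaussian kernel, the projected-gradient solver, and all parameter names (`sigma`, `R`, `width`, `lr`) are our own assumptions for the sketch.

```python
import numpy as np

def huber_loss(t, sigma):
    # Huber loss with scale sigma: t^2/2 on |t| <= sigma,
    # sigma*|t| - sigma^2/2 on the tails (linear growth -> robustness).
    a = np.abs(t)
    return np.where(a <= sigma, 0.5 * t**2, sigma * a - 0.5 * sigma**2)

def huber_grad(t, sigma):
    # Derivative of the Huber loss: the residual clipped to [-sigma, sigma].
    return np.clip(t, -sigma, sigma)

def gaussian_kernel(A, B, width=1.0):
    # Gaussian (RBF) kernel matrix between row sets A and B (assumed choice).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width**2))

def kernel_huber_fit(X, y, sigma, R, width=1.0, n_iter=500, lr=0.01):
    """Ivanov-constrained empirical Huber risk minimization (sketch):
    represent f = sum_i alpha_i K(x_i, .) and run projected gradient
    descent, projecting back onto the RKHS ball ||f||_K <= R."""
    K = gaussian_kernel(X, X, width)
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(n_iter):
        resid = K @ alpha - y
        # Gradient of the empirical Huber risk w.r.t. alpha.
        alpha -= lr * (K @ huber_grad(resid, sigma)) / n
        # Project onto the Ivanov constraint set {f : ||f||_K <= R},
        # using ||f||_K^2 = alpha^T K alpha.
        nrm = np.sqrt(max(alpha @ K @ alpha, 0.0))
        if nrm > R:
            alpha *= R / nrm
    return alpha
```

The abstract's point about adapting the scale parameter to the sample size corresponds to choosing `sigma = sigma(n)` to grow with `n`, trading robustness against bias; the sketch leaves `sigma` fixed for simplicity.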
| Original language | English |
|---|---|
| Pages (from-to) | 867-885 |
| Number of pages | 19 |
| Journal | Analysis and Applications |
| Volume | 23 |
| Issue number | 5 |
| Early online date | 22 Nov 2024 |
| DOIs | |
| Publication status | Published - Jul 2025 |
User-Defined Keywords
- concentration estimates
- Huber regression
- Ivanov regularization
- Reproducing Kernel Hilbert Space
- robust learning