Convergence rates of regularized Huber regression under weak moment conditions

Hongzhi Tong*, Michael Ng

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

Abstract

Huber regression plays a critical role in robust learning owing to its robustness to outliers and heavy-tailed noise. In this paper, we investigate the theoretical foundation of Ivanov-regularized Huber regression within a Reproducing Kernel Hilbert Space (RKHS) framework. By imposing a weak moment condition on the conditional distribution, we show that the Huber regression estimator, with the scale parameter adapted to the sample size, can learn the underlying regression function very well. Specifically, we establish explicit convergence rates for the prediction error even when the regression function lies outside the associated RKHS. Our statistical analysis greatly improves the understanding of the learning behavior of Huber regression in the absence of a light-tailed assumption on the conditional distribution.
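Although the paper's contribution is theoretical, the estimator it analyzes is straightforward to illustrate. The sketch below is not the authors' code: it implements a generic Ivanov-regularized kernel Huber regression via projected gradient descent on the empirical Huber risk, constrained to an RKHS ball ||f||_K <= R (Ivanov regularization constrains the hypothesis space directly, rather than adding a Tikhonov penalty). The Gaussian kernel, the solver, the step size, the radius R, and the toy heavy-tailed data are all illustrative assumptions, not choices made in the paper.

```python
import numpy as np

def huber_gradient(r, sigma):
    # Derivative of the Huber loss: identity on [-sigma, sigma], clipped
    # outside, so large residuals (outliers) exert only bounded influence.
    return np.clip(r, -sigma, sigma)

def gaussian_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2).
    d2 = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * d2)

def fit_ivanov_huber(X, y, sigma=1.0, R=5.0, gamma=1.0, n_iter=2000):
    """Projected gradient descent for
         min_alpha (1/n) * sum_i huber_sigma(f(x_i) - y_i)  s.t. ||f||_K <= R,
       where f(.) = sum_j alpha_j K(., x_j) by the representer theorem."""
    n = len(y)
    K = gaussian_kernel(X, X, gamma)
    # Conservative step size: the smooth part of the objective has Hessian
    # bounded by (1/n) K^2 in coefficient space.
    step = n / np.linalg.norm(K, 2) ** 2
    alpha = np.zeros(n)
    for _ in range(n_iter):
        residual = K @ alpha - y                               # f(x_i) - y_i
        alpha -= step * (K @ huber_gradient(residual, sigma)) / n
        # Enforce the Ivanov constraint: since ||f||_K^2 = alpha^T K alpha,
        # rescaling alpha projects onto the ball in the RKHS norm.
        norm = np.sqrt(max(alpha @ K @ alpha, 0.0))
        if norm > R:
            alpha *= R / norm
    return alpha

# Toy usage: sinusoidal target with Student-t noise (2 degrees of freedom,
# hence infinite variance -- a weak-moment setting in the spirit of the paper).
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(3.0 * X[:, 0]) + 0.3 * rng.standard_t(df=2, size=200)
alpha = fit_ivanov_huber(X, y, sigma=1.0, R=5.0, gamma=5.0)
X_test = np.linspace(-1.0, 1.0, 5)[:, None]
print(gaussian_kernel(X_test, X, gamma=5.0) @ alpha)
```

In the paper's analysis the scale parameter sigma is adapted to the sample size to balance robustness against approximation of the conditional mean; here it is fixed purely for illustration.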

Original language: English
Journal: Analysis and Applications
DOIs
Publication status: E-pub ahead of print - 22 Nov 2024

Scopus Subject Areas

  • Analysis
  • Applied Mathematics

User-Defined Keywords

  • concentration estimates
  • Huber regression
  • Ivanov regularization
  • Reproducing Kernel Hilbert Space
  • robust learning
