Modal regression seeks the conditional mode of a response variable given a set of covariates, providing an alternative to mean regression and quantile regression in the presence of heavy-tailed or asymmetric noise, or outliers. The goal of this project is to investigate theoretical properties of kernel-based modal regression schemes for large-scale and high-dimensional (even infinite-dimensional) problems in learning theory. We shall first consider large-scale problems by studying a regularized kernel modal regression scheme associated with varying Gaussian kernels and its combination with distributed learning, localized approaches, or Nyström subsampling, and carry out consistency and robustness analysis for the induced learning algorithms. We shall then study the impact of sparsity and robustness on the generalization performance of sparse additive modal regression schemes in reproducing kernel Hilbert spaces (RKHS), which can be applied to high- or ultra-high-dimensional learning problems. An RKHS approach to functional linear modal regression will also be studied, which allows covariates to be infinite-dimensional functional data such as random curves or images. Learning rates will be derived for the prediction error in terms of the decay rates of the eigenvalues of integral operators generated by the reproducing kernel and the covariance kernel. Finally, the estimation of the mode function based on sparse and irregularly sampled functional data will be considered. The mode function of random curves preserves important features of the data that the mean function tends to average out.
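To make the core idea concrete, the following is a minimal sketch of conditional mode estimation with Gaussian kernels. It is not the project's regularized RKHS scheme; the function name, bandwidths, and toy data are illustrative assumptions. It estimates the conditional density by a weighted kernel density estimate and takes its maximizer over a grid:

```python
import numpy as np

def conditional_mode(x0, X, Y, hx=0.1, hy=0.1, grid=None):
    """Estimate the conditional mode m(x0) = argmax_y f(y | x0)
    using Gaussian kernels; bandwidths hx, hy are illustrative choices."""
    if grid is None:
        grid = np.linspace(Y.min(), Y.max(), 400)
    # weights measuring closeness of each covariate X_i to x0
    wx = np.exp(-0.5 * ((X - x0) / hx) ** 2)
    # f(y | x0) is proportional to sum_i wx_i * K((y - Y_i)/hy), on the grid
    dens = (wx[None, :]
            * np.exp(-0.5 * ((grid[:, None] - Y[None, :]) / hy) ** 2)).sum(axis=1)
    return grid[np.argmax(dens)]

# Toy data: Y = sin(2*pi*X) plus heavy-tailed Student-t noise,
# the setting where the conditional mode is a robust target
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, 2000)
Y = np.sin(2 * np.pi * X) + 0.1 * rng.standard_t(df=2, size=X.size)

print(conditional_mode(0.25, X, Y))  # should be near sin(2*pi*0.25) = 1
```

The grid search over `y` is only workable in this one-dimensional toy setting; the RKHS schemes studied in the project instead optimize a regularized modal risk directly over a function class.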
Effective start/end date: 1/09/18 → 28/02/22