Abstract
We consider the minimum error entropy (MEE) criterion and an empirical risk minimization learning algorithm in which a Parzen-windowing approximation of Rényi's entropy (of order 2) is minimized. This learning algorithm involves a Parzen windowing scaling parameter. We present a learning theory approach for this MEE algorithm in a regression setting when the scaling parameter is large. Consistency and explicit convergence rates are provided in terms of the approximation ability and capacity of the involved hypothesis space. Novel analysis is carried out for the generalization error associated with Rényi's entropy and a Parzen windowing function, to overcome technical difficulties arising from the essential differences between classical least squares problems and the MEE setting. A symmetrized least squares error, which is related to some ranking algorithms, is introduced and analyzed.
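To make the criterion concrete, the following is a minimal sketch of the Parzen-windowed estimator of Rényi's quadratic entropy that MEE-type algorithms minimize, assuming a Gaussian window with scaling parameter h (the standard information-theoretic-learning estimator); the function names and toy data are illustrative and not taken from the paper.

```python
import numpy as np

def renyi_quadratic_entropy(errors, h):
    """Parzen-window estimate of Rényi's quadratic entropy of the errors.

    Estimates H_2 = -log V, where the "information potential" V is the
    average of a Gaussian window evaluated at all pairwise error
    differences. This is a hedged illustration of the criterion described
    in the abstract, not the paper's exact formulation.
    """
    e = np.asarray(errors, dtype=float)
    n = e.size
    diffs = e[:, None] - e[None, :]  # all pairwise differences e_i - e_j
    # Gaussian Parzen window with scaling (bandwidth) parameter h
    window = np.exp(-diffs**2 / (2.0 * h**2)) / (np.sqrt(2.0 * np.pi) * h)
    information_potential = window.sum() / n**2
    return -np.log(information_potential)

# Example: MEE empirical risk of a candidate linear model on toy data
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 2.0 * x + 0.1 * rng.normal(size=200)
errors = y - 1.9 * x                           # residuals of the candidate model
print(renyi_quadratic_entropy(errors, h=1.0))  # risk to be minimized over models
```

Note that the estimator depends on the residuals only through their pairwise differences, which is one way to see why a symmetrized least squares error arises in the analysis.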
| Original language | English |
| --- | --- |
| Pages (from-to) | 377–397 |
| Number of pages | 21 |
| Journal | Journal of Machine Learning Research |
| Volume | 14 |
| Issue number | 1 |
| Publication status | Published - Jan 2013 |
User-Defined Keywords
- minimum error entropy
- approximation error
- empirical risk minimization
- Rényi's entropy
- learning theory