Learning Theory Approach to Minimum Error Entropy Criterion

Ting Hu, Jun Fan, Qiang Wu, Ding Xuan Zhou

Research output: Contribution to journal › Journal article › peer-review

78 Citations (Scopus)

Abstract

We consider the minimum error entropy (MEE) criterion and an empirical risk minimization learning algorithm when an approximation of Rényi's entropy (of order 2) by Parzen windowing is minimized. This learning algorithm involves a Parzen windowing scaling parameter. We present a learning theory approach for this MEE algorithm in a regression setting when the scaling parameter is large. Consistency and explicit convergence rates are provided in terms of the approximation ability and capacity of the involved hypothesis space. Novel analysis is carried out for the generalization error associated with Rényi's entropy and a Parzen windowing function, to overcome technical difficulties arising from the essential differences between the classical least squares problems and the MEE setting. An involved symmetrized least squares error is introduced and analyzed, which is related to some ranking algorithms.
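To make the criterion described above concrete, the following is a minimal sketch of an empirical MEE-style objective using a Gaussian Parzen window over pairwise error differences. It illustrates the general form of the minimized quantity (the negated information potential whose negative logarithm estimates Rényi's entropy of order 2); the windowing function, normalization, and scaling conventions here are illustrative assumptions and need not match the paper's exact definitions.

```python
# Hedged sketch: an empirical MEE-style objective with a Gaussian Parzen window.
# The exact window G and normalization used in the paper may differ; this only
# illustrates the general criterion described in the abstract.
import numpy as np

def empirical_mee_risk(residuals: np.ndarray, h: float) -> float:
    """Negated information potential of the residuals e_i = y_i - f(x_i),
    estimated by Parzen windowing with scaling parameter h (Gaussian window).
    Minimizing this over f in a hypothesis space corresponds to minimizing an
    estimate of Renyi's entropy of order 2 of the error variable."""
    e = residuals.reshape(-1, 1)
    pairwise = e - e.T                            # all pairwise differences e_i - e_j
    window = np.exp(-pairwise**2 / (2 * h**2))    # Gaussian Parzen window
    m = len(residuals)
    return -window.sum() / (m**2)                 # negated information potential

# Usage: residuals concentrated near a constant give a smaller (better) objective
rng = np.random.default_rng(0)
tight = rng.normal(scale=0.1, size=100)
spread = rng.normal(scale=1.0, size=100)
print(empirical_mee_risk(tight, h=1.0) < empirical_mee_risk(spread, h=1.0))  # True
```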
Original language: English
Pages (from-to): 377–397
Number of pages: 21
Journal: Journal of Machine Learning Research
Volume: 14
Issue number: 1
Publication status: Published - Jan 2013

User-Defined Keywords

  • minimum error entropy
  • approximation error
  • empirical risk minimization
  • Rényi's entropy
  • learning theory
