Abstract
Multi-biometric feature-level fusion exploits feature information from more than one biometric source to improve recognition performance and template security. When ordered and unordered feature sets representing different biometric sources are involved, feature fusion becomes problematic. One way to mitigate this incompatibility is to transform the extracted unordered feature sets into an ordered feature representation, without sacrificing the discrimination power of the original features, so that feature fusion can subsequently be applied on the ordered features. Existing unordered-to-ordered feature transformation methods are designed for three-dimensional minutiae point sets and are mostly not adaptable to high-dimensional feature inputs. This paper proposes a feature transformation scheme that learns a histogram representation from an unordered feature set. Our algorithm estimates the component-wise correspondences among the sample feature sets of each user and then learns a set of bins per user based on the distribution of the mutually corresponding feature instances. Given the learnt bins, the histogram representation of a sample is generated by concatenating the normalized frequencies of the unordered features that fall into each bin. Experimental results on seven unimodal and three bimodal biometric databases show that our feature transformation scheme preserves the discrimination power of the original features better than state-of-the-art transformation schemes.
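The final transformation step, turning an unordered feature set into a fixed-length histogram over learnt bins, can be illustrated with a minimal sketch. The Python snippet below is not from the paper: the representation of a bin as an axis-aligned interval per feature component and the function `histogram_representation` are hypothetical assumptions used purely to show how counting and normalizing bin memberships yields an ordered, fixed-length vector from a variable-size feature set.

```python
import numpy as np

def histogram_representation(features, bins):
    """Map an unordered set of feature vectors to a fixed-length histogram.

    features : (n, d) array of unordered feature instances from one sample
    bins     : list of (lower, upper) pairs, each of shape (d,), the per-user
               bins (hypothetical axis-aligned interval representation)
    Returns a 1-D vector of normalized bin frequencies.
    """
    features = np.asarray(features, dtype=float)
    counts = np.empty(len(bins))
    for i, (lower, upper) in enumerate(bins):
        # A feature instance falls into a bin if it lies within the bin's bounds
        inside = np.all((features >= lower) & (features <= upper), axis=1)
        counts[i] = inside.sum()
    # Normalize by the set size so samples with different numbers of
    # feature instances remain comparable
    return counts / max(len(features), 1)

# Usage: three 2-D feature instances, two hypothetical learnt bins
sample = [[0.1, 0.2], [0.15, 0.25], [0.8, 0.9]]
bins = [(np.array([0.0, 0.0]), np.array([0.5, 0.5])),
        (np.array([0.5, 0.5]), np.array([1.0, 1.0]))]
print(histogram_representation(sample, bins))  # -> [0.6667 0.3333]
```

The resulting ordered vector has the same length for every sample of a user, which is what makes subsequent feature-level fusion with other ordered representations straightforward.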
Original language | English |
---|---|
Pages (from-to) | 706-719 |
Number of pages | 14 |
Journal | Pattern Recognition |
Volume | 60 |
DOIs | |
Publication status | Published - 1 Dec 2016 |
Scopus Subject Areas
- Software
- Signal Processing
- Computer Vision and Pattern Recognition
- Artificial Intelligence
User-Defined Keywords
- Biometrics
- Face recognition
- Feature extraction
- Fingerprint recognition
- Histograms
- Learning