Assessment of the Effectiveness of Seven Biometric Feature Normalization Techniques

IEEE Transactions on Information Forensics and Security (2019)

Cited by 46 | Views 9
Abstract
The importance of normalizing biometric features or matching scores is well understood in the multimodal biometric case, but the unimodal case has received less attention. Prior reports assess the effectiveness of normalization directly on biometric performance. We propose that this process logically consists of two independent steps: (1) methods to equalize the effect of each biometric feature on the similarity scores calculated from all the features together and (2) methods of weighting the normalized features to optimize biometric performance. In this report, we address step 1 only and focus exclusively on normally distributed features. We show how differences in the variance of features lead to differences in the strength of the influence of each feature on the similarity scores produced from all the features. Because these variance differences are unrelated to importance in the biometric sense, they should not be allowed greater weight in the assessment of biometric performance. We employed two types of features: (1) real eye-movement features and (2) synthetic features. We compare six variance normalization methods (histogram equalization, L1 normalization, median normalization, z-score normalization, min–max normalization, and L-infinity normalization) and one distance metric (Mahalanobis distance) in terms of how well they reduce the impact of the variance differences. The effectiveness of the different techniques on real data depended on the strength of the inter-correlation of the features. For weakly correlated real features and for synthetic features, histogram equalization was the best method, followed by L1 normalization.
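The variance-equalization idea behind step 1 can be illustrated with two of the normalization methods the abstract lists (z-score and min–max). The sketch below is illustrative only, with synthetic feature values and function names of our own choosing; it is not the authors' code.

```python
import numpy as np

def zscore(x):
    # z-score normalization: subtract the mean, divide by the standard deviation
    return (x - x.mean()) / x.std()

def minmax(x):
    # min-max normalization: rescale values into the range [0, 1]
    return (x - x.min()) / (x.max() - x.min())

rng = np.random.default_rng(0)
# Two normally distributed features with very different variances,
# mimicking the paper's setting of normally distributed features
f1 = rng.normal(0.0, 1.0, 10_000)
f2 = rng.normal(0.0, 50.0, 10_000)

# Before normalization, f2's large variance lets it dominate any
# similarity score computed over both features jointly
print(f1.var(), f2.var())

# After z-score normalization both features have unit variance,
# so neither dominates purely because of its spread
print(zscore(f1).var(), zscore(f2).var())
```

Min–max normalization equalizes the range rather than the variance, which is one reason the methods can rank differently depending on the feature distributions and their inter-correlations.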
Keywords
Histograms, Correlation, Task analysis, Feature extraction, Correlation coefficient, Visualization, Oscillators