Discriminative metric: Schatten norm vs. vector norm

ICPR (2012)

Cited by 37
Abstract
The notion of a metric is fundamental to pattern recognition, and the vector 2-norm ||·||₂, i.e., the Euclidean distance, is one of the most widely used metrics. However, it is often the case that the inputs are matrices, e.g., 2D images in face recognition. Since a matrix carries more structural information than its vectorization, it is highly preferable to work with the matrix representation of the original image rather than a simple vector. In this paper, we first propose a class of discriminative metrics for matrices, the Schatten p-norms, which better explain why, under the Euclidean metric, the differences among facial images caused by extrinsic factors, e.g., illumination, are more significant than the differences caused by identity variations. Second, we propose a novel Principal Component Analysis method based on the Schatten 1-norm, which can easily be extended to other subspace learning methods. Extensive experiments on the Yale B, CMU PIE, ORL and AR databases demonstrate the effectiveness of our method.
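As a rough illustration of the metrics compared in the abstract (not the authors' implementation), the sketch below computes the Schatten p-norm of a matrix from its singular values and checks two standard facts: the Schatten 2-norm coincides with the Euclidean norm of the vectorized matrix, while the Schatten 1-norm is the nuclear norm underlying the proposed PCA variant. The helper name `schatten_norm` and the random test matrices are illustrative assumptions.

```python
# Minimal sketch, assuming NumPy only. Not the paper's code.
import numpy as np


def schatten_norm(A: np.ndarray, p: float = 1.0) -> float:
    """Schatten p-norm: the l_p norm of the singular values of A."""
    s = np.linalg.svd(A, compute_uv=False)      # singular values of A
    return float(np.sum(s ** p) ** (1.0 / p))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((32, 32))           # stand-in for a 2D face image
    B = rng.standard_normal((32, 32))
    D = A - B                                   # difference between two "images"

    # Schatten 2-norm equals the Frobenius norm, i.e. the Euclidean
    # (vector 2-) norm of the vectorized difference.
    print(schatten_norm(D, p=2), np.linalg.norm(D.ravel(), 2))

    # Schatten 1-norm equals the nuclear norm, the metric used for the
    # Schatten-1 PCA discussed in the paper.
    print(schatten_norm(D, p=1), np.linalg.norm(D, ord="nuc"))
```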
Keywords
schatten p-norm, schatten 1-norm, image representation, structure information, vector 2-norm, vector norm, discriminative metrics, euclidean metric, euclidean distance, face recognition, pattern recognition, 2d images, matrix representation, matrix algebra, principal component analysis, subspace learning methods, learning (artificial intelligence), visual databases, yale b databases, cmu pie databases, orl databases, ar databases