Locality constrained representation-based K-nearest neighbor classification.

Knowledge-Based Systems (2019)

Abstract
The K-nearest neighbor rule (KNN) is one of the most widely used methods in pattern recognition. However, KNN-based classification performance is severely affected by sensitivity to the neighborhood size k and by the simple majority vote over the k-neighborhood, especially in small-sample-size cases with outliers. To overcome these issues, we propose two locality constrained representation-based k-nearest neighbor rules that further improve KNN-based classification performance. The first is the weighted representation-based k-nearest neighbor rule (WRKNN). In WRKNN, the test sample is represented as a linear combination of its k nearest neighbors from each class, and the localities of the class-specific k nearest neighbors serve as weights that constrain the corresponding representation coefficients. Using the representation coefficients of the k nearest neighbors per class, the representation-based distance between the test sample and the class-specific k nearest neighbors is computed as the classification decision rule. The second is the weighted local mean representation-based k-nearest neighbor rule (WLMRKNN). In WLMRKNN, k local mean vectors of the k nearest neighbors per class are first calculated and then used to represent the test sample. In this linear combination of the class-specific k local mean vectors, the localities of the local mean vectors per class act as weights that constrain their representation coefficients. The representation coefficients are then used to form the classification decision rule, namely the class-specific representation-based distance between the test sample and the k local mean vectors per class. To demonstrate the effectiveness of the proposed methods, we conduct extensive experiments on UCI and UCR data sets and on face databases, in comparison with seven related competitive KNN-based methods. The experimental results show that the proposed methods perform better and are less sensitive to k, especially in small-sample-size cases.
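To make the two decision rules described above concrete, the following is a minimal sketch of WRKNN and WLMRKNN, not the authors' reference implementation. It assumes a standard locality-regularised least-squares objective, min_b ||x - B b||^2 + lam * ||diag(w) b||^2, with Euclidean distances as localities, cumulative means of the sorted neighbors as the k local mean vectors, and a hypothetical regularisation parameter lam; the exact objective and distance measure in the paper may differ.

```python
import numpy as np

def _weighted_representation_residual(x, B, w, lam):
    """Residual of representing x with basis B (d x k) under a locality penalty.

    Solves the (assumed) regularised least-squares problem
        min_b ||x - B b||^2 + lam * ||diag(w) b||^2
    in closed form and returns the squared reconstruction error ||x - B b||^2.
    """
    A = B.T @ B + lam * np.diag(w ** 2)
    b = np.linalg.solve(A, B.T @ x)
    return float(np.sum((x - B @ b) ** 2))

def wrknn_predict(x, X_train, y_train, k=7, lam=0.1):
    """WRKNN sketch: per class, represent x by its k nearest neighbors,
    using the neighbor-to-x distances as locality weights on the coefficients."""
    classes = np.unique(y_train)
    residuals = []
    for c in classes:
        Xc = X_train[y_train == c]
        d = np.linalg.norm(Xc - x, axis=1)       # localities within class c
        idx = np.argsort(d)[:k]                  # k nearest neighbors of class c
        B = Xc[idx].T                            # d x k class-specific neighbor matrix
        residuals.append(_weighted_representation_residual(x, B, d[idx], lam))
    return classes[int(np.argmin(residuals))]    # smallest class-specific residual wins

def wlmrknn_predict(x, X_train, y_train, k=7, lam=0.1):
    """WLMRKNN sketch: per class, build k local mean vectors (cumulative means
    of the sorted k nearest neighbors) and represent x by them instead."""
    classes = np.unique(y_train)
    residuals = []
    for c in classes:
        Xc = X_train[y_train == c]
        d = np.linalg.norm(Xc - x, axis=1)
        nn = Xc[np.argsort(d)[:k]]                                   # sorted k nearest neighbors
        means = np.cumsum(nn, axis=0) / np.arange(1, len(nn) + 1)[:, None]
        w = np.linalg.norm(means - x, axis=1)                        # localities of the local means
        residuals.append(_weighted_representation_residual(x, means.T, w, lam))
    return classes[int(np.argmin(residuals))]
```

Both rules assign the test sample to the class whose weighted representation reconstructs it with the smallest residual, e.g. `wrknn_predict(x_test, X_train, y_train, k=7, lam=0.1)`; lam and k would need tuning per data set.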
Keywords
K-nearest neighbor rule, Local mean vector, Representation-based distance, Pattern recognition