A non-parametric approach to extending generic binary classifiers for multi-classification.

Pattern Recognition (2016)

Cited by 27
Abstract
Ensemble methods, which combine generic binary classifier scores to generate a multi-classification output, are commonly used in state-of-the-art computer vision and pattern recognition systems that rely on multi-classification. In particular, we consider the one-vs-one decomposition of the multi-class problem, where a binary classifier model is trained to discriminate each pair of classes. We describe a robust multi-classification pipeline, which at a high level involves projecting binary classifier scores into compact orthogonal subspaces, followed by a non-linear probabilistic multi-classification step using Kernel Density Estimation (KDE). We compare our approach against state-of-the-art ensemble methods (DCS, DRCW) on 16 multi-class datasets. We also compare against the most commonly used ensemble methods (VOTE, NEST) on 6 real-world computer vision datasets. Finally, we measure the statistical significance of our approach using non-parametric tests. Experimental results show that our approach gives a statistically significant improvement in multi-classification performance over the state of the art.
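The pipeline the abstract outlines can be sketched end-to-end with standard tools. The following is a minimal illustration, not the paper's exact implementation: it assumes scikit-learn, uses an SVM for the one-vs-one pairwise scores, PCA as a stand-in for the compact orthogonal subspace projection, and a fixed KDE bandwidth rather than any tuning the authors may perform.

```python
# Sketch of the described pipeline (assumptions noted above):
# one-vs-one score vectors -> orthogonal subspace projection -> per-class KDE.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KernelDensity
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# One-vs-one decomposition: decision_function returns one score per class pair,
# so each sample becomes a vector of K*(K-1)/2 pairwise scores.
ovo = SVC(decision_function_shape="ovo").fit(Xtr, ytr)
Str, Ste = ovo.decision_function(Xtr), ovo.decision_function(Xte)

# Project the pairwise score vectors into a compact orthogonal subspace
# (PCA here; the paper's specific projection may differ).
pca = PCA(n_components=2).fit(Str)
Ztr, Zte = pca.transform(Str), pca.transform(Ste)

# Non-parametric step: fit one KDE per class on the projected training scores.
classes = np.unique(ytr)
kdes = {c: KernelDensity(bandwidth=0.5).fit(Ztr[ytr == c]) for c in classes}

# Classify each test point by maximum class-conditional log-density.
logp = np.column_stack([kdes[c].score_samples(Zte) for c in classes])
pred = classes[logp.argmax(axis=1)]
print("accuracy:", (pred == yte).mean())
```

Because `score_samples` returns log-densities, the argmax over classes corresponds to a maximum-likelihood decision; adding per-class log-priors would turn it into a MAP rule.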
Keywords
Multi-classification, Ensemble method, One-vs-one, Orthogonal subspace, Non-parametric density estimation