Bagged One-Class Classifiers In The Presence Of Outliers

INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE (2013)

Abstract
The problem of training classifiers only with target data arises in many applications where nontarget data are too costly, too difficult to obtain, or not available at all. Several one-class classification methods have been presented to solve this problem, but most of these methods are highly sensitive to the presence of outliers in the target class. Ensemble methods have therefore been proposed as a powerful way to improve the classification performance of binary/multi-class learning algorithms by introducing diversity among classifiers. However, their application to one-class classification has been rather limited. In this paper, we present a new ensemble method based on a nonparametric weighted bagging strategy for one-class classification, to improve accuracy in the presence of outliers. While the standard bagging strategy assumes a uniform data distribution, the method we propose here estimates a probability density based on a forest structure of the data. This assumption allows the data distribution to be estimated from simple univariate and bivariate kernel densities. Experiments using original and noisy versions of 20 different datasets show that bagging ensemble methods applied to different one-class classifiers outperform the base one-class classification methods. Moreover, we show that, on the noisy versions of the datasets, the nonparametric weighted bagging strategy we propose outperforms the classical bagging strategy in a statistically significant way.
Keywords
One-class classifier, ensemble methods, bagging, outliers