Support Vector Machines versus Boosting

msra(2006)

Citations 22 | Views 34
Abstract
Support Vector Machines (SVMs) and Adaptive Boosting (AdaBoost) are two successful classification methods. They are essentially similar in that both try to maximize the minimal margin on a training set. In this work, we present a level platform on which to compare the two learning algorithms in terms of test error, margin distribution, and generalization power. Two basic models, polynomials and decision stumps, are used to evaluate eight real-world binary datasets. We conclude that AdaBoost with linear SVMs as base learners and SVMs with decision stumps as kernels generalize better than the other scenarios. Although the training error of AdaBoost approaches zero as the number of weak learners increases, its test error starts to rise after a certain number of rounds. For both SVMs and AdaBoost, the cumulative margin distribution is indicative of the test error.
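The comparison described above can be sketched with off-the-shelf tools. The following is a minimal illustration, not the paper's actual setup: it uses a synthetic binary dataset rather than the eight real-world datasets, and the specific parameters are assumptions. It trains a linear SVM (which maximizes the minimal geometric margin) and an AdaBoost ensemble of decision stumps (whose boosting rounds push up the training-margin distribution), then compares their test accuracy.

```python
# Hedged sketch of the SVM-vs-AdaBoost comparison; dataset and
# parameters are illustrative assumptions, not the paper's protocol.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC
from sklearn.ensemble import AdaBoostClassifier

# Synthetic stand-in for a real-world binary dataset.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Linear SVM: maximizes the minimal margin on the training set.
svm = LinearSVC(C=1.0).fit(X_tr, y_tr)

# AdaBoost: the default base learner is a depth-1 decision tree
# (a decision stump), matching one of the weak learners above.
ada = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

print("linear SVM test accuracy:", svm.score(X_te, y_te))
print("AdaBoost (stumps) test accuracy:", ada.score(X_te, y_te))
```

Monitoring `ada.staged_score(X_te, y_te)` over boosting rounds would expose the behavior the abstract notes: training error keeps falling while test error can begin to rise after some number of weak learners.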
Keywords
margin, adaboost, svm