Direct optimization of margins improves generalization in combined classifiers

Proceedings of the 1998 Conference on Advances in Neural Information Processing Systems 11 (1998)

Cited 69 | Viewed 40
Abstract
[Figure 1: Cumulative training margin distributions for AdaBoost versus our "Direct Optimization Of Margins" (DOOM) algorithm. The dark curve is AdaBoost, the light curve is DOOM. DOOM sacrifices significant training error for improved test error (horizontal marks on the margin = 0 line).]

1 Introduction

Many learning algorithms for pattern classification minimize some cost function of the training data, with the aim of minimizing error (the probability of misclassifying an example). One example of such ...
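The curves in Figure 1 can be reproduced from any trained voting classifier: with labels y in {-1, +1}, the margin of an example is y times the weighted vote, and the cumulative distribution plots the fraction of training examples whose margin falls at or below each threshold. A minimal sketch, assuming hypothetical stand-ins `base_classifiers` (each returning predictions in {-1, +1}) and vote weights `alphas` from an already-trained ensemble:

```python
import numpy as np

def margins(X, y, base_classifiers, alphas):
    """Normalized margins y * f(x) for a voted classifier f(x) = sum_t alpha_t h_t(x).

    y has entries in {-1, +1}. Weights are normalized so margins lie in [-1, 1].
    base_classifiers and alphas are hypothetical stand-ins for a trained ensemble.
    """
    alphas = np.asarray(alphas, dtype=float)
    alphas = alphas / alphas.sum()                       # normalize votes to [-1, 1]
    votes = np.stack([h(X) for h in base_classifiers])   # shape (T, n), entries in {-1, +1}
    return y * (alphas @ votes)                          # shape (n,)

def cumulative_margin_distribution(m, thresholds):
    """Fraction of training examples with margin <= each threshold."""
    m = np.sort(m)
    return np.searchsorted(m, thresholds, side="right") / m.size
```

Plotting `cumulative_margin_distribution` over a grid of thresholds for AdaBoost's weights versus DOOM's would yield the dark and light curves of the figure; the value at threshold 0 is the training error, since examples with margin at or below 0 are misclassified.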
Keywords
combined classifier, direct optimization, cost function