Guess-Averse Loss Functions For Cost-Sensitive Multiclass Boosting.

ICML'14: Proceedings of the 31st International Conference on Machine Learning, Volume 32 (2014)

Abstract
Cost-sensitive multiclass classification has recently acquired significance in several applications, through the introduction of multiclass datasets with well-defined misclassification costs. The design of classification algorithms for this setting is considered. It is argued that the unreliable performance of current algorithms is due to the inability of the underlying loss functions to enforce a fundamental property. This property, denoted guess-aversion, is that the loss should encourage correct classifications over the arbitrary guessing that ensues when all classes are equally scored by the classifier. While guess-aversion holds trivially for binary classification, this is not true in the multiclass setting. A new family of cost-sensitive guess-averse loss functions is derived and used to design new cost-sensitive multiclass boosting algorithms, denoted GEL- and GLL-MCBoost. Extensive experiments demonstrate (1) the importance of guess-aversion and (2) that the GLL loss function outperforms other loss functions for multiclass boosting.
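For intuition, the sketch below illustrates the guess-aversion idea described in the abstract: a score vector that ranks the true class strictly highest should incur strictly less loss than the all-zero score vector, where every class is tied and the classifier can only guess. The cost-weighted exponential-style loss used here is a hypothetical stand-in for illustration, not the paper's GEL or GLL losses, and the function names are assumptions.

```python
import numpy as np

def multiclass_exp_loss(y, f, C):
    """Illustrative cost-weighted exponential-style multiclass loss.

    y : int, index of the true class
    f : (K,) array of class scores
    C : (K, K) cost matrix, C[y, k] = cost of predicting k when the truth is y

    NOTE: hypothetical stand-in, not the GEL/GLL losses from the paper.
    """
    K = len(f)
    return sum(C[y, k] * np.exp(f[k] - f[y]) for k in range(K) if k != y)

def is_guess_averse_at(y, f_correct, C, loss=multiclass_exp_loss):
    """Check the guess-aversion inequality at one point: a score vector that
    ranks the true class strictly highest should have strictly lower loss
    than the all-zero (uniform guessing) score vector."""
    K = len(f_correct)
    assert all(f_correct[y] > f_correct[k] for k in range(K) if k != y)
    return loss(y, f_correct, C) < loss(y, np.zeros(K), C)

# Example: 3 classes, 0/1 cost matrix, true class 0 scored highest.
C = np.ones((3, 3)) - np.eye(3)
print(is_guess_averse_at(0, np.array([1.0, 0.0, -0.5]), C))  # True
```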