Artificial Intelligence XXXVI: 39th SGAI International Conference on Artificial Intelligence, AI 2019, Cambridge, UK, December 17–19, 2019, Proceedings

Max Bramer, Miltos Petridis

Artificial Intelligence XXXVI (2019)

Abstract
In multi-label classification, a data point can be labelled with more than one class at the same time. A common but simple approach to multi-label classification is to train an individual binary classifier per label; performance can be improved, however, by considering associations between the labels, and algorithms such as classifier chains and RAKEL do this effectively. Like most machine learning algorithms, these approaches require accurate hyperparameter tuning, a computationally expensive optimisation problem, and tuning is important to train a good multi-label classifier model. There is a scarcity in the literature of effective multi-label classification approaches that do not require extensive hyperparameter tuning. This paper addresses this scarcity by proposing CascadeML, a multi-label classification approach based on cascade neural networks that takes label associations into account and requires minimal hyperparameter tuning. The performance of the CascadeML approach is evaluated using 10 multi-label datasets and compared with other leading multi-label classification algorithms. Results show that CascadeML performs comparably with the leading approaches, but without the need for hyperparameter tuning.
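To illustrate the baseline approaches the abstract contrasts, the following minimal sketch compares binary relevance (one independent binary classifier per label) with a classifier chain, which passes earlier label predictions to later classifiers to capture label associations. This is not the paper's CascadeML implementation; it uses scikit-learn on a synthetic dataset, and the estimator choice and parameters are illustrative assumptions only.

```python
# Sketch: binary relevance vs. classifier chain for multi-label classification.
# Assumptions: scikit-learn is available; logistic regression is an arbitrary base learner.
from sklearn.datasets import make_multilabel_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier
from sklearn.multioutput import ClassifierChain

# Synthetic multi-label data: each row of Y is a binary label-indicator vector.
X, Y = make_multilabel_classification(n_samples=1000, n_classes=5, random_state=0)
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)

# Binary relevance: one independent binary classifier per label, ignoring label associations.
br = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X_tr, Y_tr)

# Classifier chain: each classifier in the chain also receives the preceding labels
# as extra features, so associations between labels are modelled.
chain = ClassifierChain(LogisticRegression(max_iter=1000), random_state=0).fit(X_tr, Y_tr)

for name, model in [("binary relevance", br), ("classifier chain", chain)]:
    print(name, f1_score(Y_te, model.predict(X_te), average="micro"))
```

Both baselines above still inherit the base learner's hyperparameters; the point of CascadeML, as described in the abstract, is to grow a cascade network that models label associations while avoiding that tuning step.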