
Learning Optimization for Decision Tree Classification of Non-Categorical Data with Information Gain Impurity Criterion.

2014 International Joint Conference on Neural Networks (IJCNN), 2014

Cited 12 | Viewed 14
Abstract
We consider the problem of constructing decision trees when the data is non-categorical and inherently high-dimensional. Conventional tree-growing algorithms that either rely on univariate splits or employ direct search methods to determine multivariate splitting conditions are computationally prohibitive in this setting. On the other hand, applying standard optimization methods to find locally optimal splitting conditions is obstructed by an abundance of local minima and by discontinuities of classical goodness functions such as information gain or Gini impurity. To avoid this limitation, a method is proposed that generates a smoothed replacement for the impurity measure of a split. This makes a vast number of efficient optimization techniques applicable to finding locally optimal splits and, at the same time, decreases the number of local minima. The approach is illustrated with examples.
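The core idea — replacing the discontinuous impurity of a hard split with a smooth surrogate — can be sketched as follows. This is a minimal illustration, not the paper's exact construction: it assumes a sigmoid relaxation of the multivariate split indicator (the parameter `beta` controls the sharpness, and the function names are hypothetical), yielding a smoothed information gain that is continuous in the split parameters and therefore amenable to standard optimizers.

```python
import numpy as np

def entropy(p):
    # Shannon entropy of a class-probability vector, ignoring zero entries.
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def soft_information_gain(theta, X, y, beta=5.0):
    # theta = [w, b] parameterizes a multivariate (oblique) split w.x + b >= 0.
    # The hard 0/1 split indicator is replaced by a sigmoid, so the
    # criterion is continuous in theta (illustrative smoothing choice,
    # not necessarily the one used in the paper).
    w, b = theta[:-1], theta[-1]
    s = 1.0 / (1.0 + np.exp(-beta * (X @ w + b)))  # soft membership in right child
    classes = np.unique(y)
    n = len(y)
    p_parent = np.array([np.mean(y == c) for c in classes])
    nR = s.sum()
    nL = n - nR
    # Class proportions in each child, weighted by soft membership.
    pR = np.array([s[y == c].sum() for c in classes]) / max(nR, 1e-12)
    pL = np.array([(1.0 - s)[y == c].sum() for c in classes]) / max(nL, 1e-12)
    return entropy(p_parent) - (nL / n) * entropy(pL) - (nR / n) * entropy(pR)
```

For perfectly separating parameters the smoothed gain approaches the hard information gain (1 bit for a balanced two-class problem), while an uninformative split (all soft memberships equal to 0.5) yields zero gain; since the surrogate is smooth in `theta`, gradient-based or derivative-free optimizers can be applied to it directly.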
Keywords
decision trees,learning (artificial intelligence),optimisation,pattern classification,Gini impurity,classical goodness functions,conventional tree growing algorithms,direct search methods,high-dimensional data,information gain impurity criterion,learning optimization,multivariate splitting conditions,noncategorical data decision tree classification,split impurity,univariate splits