Efficient Minimax Clustering Probability Machine By Generalized Probability Product Kernel

2008 IEEE International Joint Conference on Neural Networks, Vols 1-8 (2008)

Abstract
Minimax Probability Machine (MPM), which learns a decision function by minimizing the maximum probability of misclassification, has demonstrated very promising performance in classification and regression. However, MPM is often criticized for its slow training and test procedures. To address this problem, we propose an efficient model named Minimax Clustering Probability Machine (MCPM). Following many traditional methods, we represent the training data points by several clusters. Different from these methods, a Generalized Probability Product Kernel is defined to capture the distributional information within the clusters. By incorporating clustering information via a non-linear kernel, MCPM trains and tests quickly on classification problems while delivering promising performance. Another appealing property of the proposed approach is that MCPM still derives an explicit worst-case accuracy bound for its decision boundary. Experimental results on synthetic and real data validate the effectiveness of MCPM for classification while attaining high accuracy.
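For intuition, a probability product kernel between cluster models has a closed form when each cluster is summarized by a Gaussian: with the exponent set to 1 (the "expected likelihood" case), K(N(μ1, Σ1), N(μ2, Σ2)) = ∫ N(x; μ1, Σ1) N(x; μ2, Σ2) dx = N(μ1; μ2, Σ1 + Σ2). The sketch below is an illustration of that standard Gaussian case, not the paper's exact Generalized Probability Product Kernel; the function name `ppk_gaussian` and the toy data are assumptions made for the example.

```python
import numpy as np
from scipy.stats import multivariate_normal


def ppk_gaussian(mu1, cov1, mu2, cov2):
    """Probability product kernel (exponent rho = 1) between two
    Gaussian cluster models:
        K = integral of N(x; mu1, cov1) * N(x; mu2, cov2) dx
          = N(mu1; mu2, cov1 + cov2)
    """
    return multivariate_normal.pdf(mu1, mean=mu2, cov=cov1 + cov2)


# Toy usage: summarize each class by a cluster mean and covariance,
# then compare clusters through the kernel.
rng = np.random.default_rng(0)
X_pos = rng.normal(loc=1.0, scale=0.5, size=(50, 2))
X_neg = rng.normal(loc=-1.0, scale=0.5, size=(50, 2))

mu_p, cov_p = X_pos.mean(axis=0), np.cov(X_pos, rowvar=False)
mu_n, cov_n = X_neg.mean(axis=0), np.cov(X_neg, rowvar=False)

print("K(pos, pos) =", ppk_gaussian(mu_p, cov_p, mu_p, cov_p))  # large: same cluster
print("K(pos, neg) =", ppk_gaussian(mu_p, cov_p, mu_n, cov_n))  # small: distant clusters
```

A kernel of this form compares whole cluster distributions rather than individual points, which is what lets the number of kernel evaluations scale with the number of clusters instead of the number of training samples.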
Keywords
Gaussian distribution, machine learning, kernel, training data, accuracy, probability, support vector machines, optimization, testing, data validation, learning (artificial intelligence), covariance matrix