An ordinal kernel trick for a computationally efficient support vector machine

Neural Networks (2014)

Citations: 5 | Views: 13
Abstract
Support vector machines (SVM) offer a principled approach to machine learning (ML) problems because of their mathematical foundations in statistical learning theory. As a non-parametric method, SVM requires all the data to be available during the training phase. However, once the model parameters are identified, SVM relies for future prediction only on a subset of these training instances, called support vectors (SV). The SVM model is mathematically written as a weighted sum over these SV, whose number, rather than the dimensionality of the input space, defines SVM's complexity. Since the final number of SV can be up to half the size of the training dataset, SVM becomes challenging to run on energy-aware computing platforms. In this work we propose Knee-Cut SVM (KCSVM) and Knee-Cut Ordinal Optimization inspired SVM (KCOOSVM), which use a soft trick of ordered kernel values and uniform subsampling to reduce SVM's prediction computational complexity while maintaining an acceptable impact on its generalization capability. When tested on several databases from UCI, KCSVM and KCOOSVM produced promising results, comparable to similar published algorithms.
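To make the complexity argument concrete, the sketch below shows the standard kernel-sum SVM decision function and a simplified "keep only the k largest kernel values" truncation. This is an illustration of the ordering idea only, not the paper's actual algorithm: the function names, the RBF kernel choice, and the brute-force sort (which in practice would be replaced by the paper's ordinal-optimization-guided subsampling, since sorting all kernel values costs as much as the full sum) are assumptions for the example.

```python
import numpy as np

def rbf_kernel(x, sv, gamma=0.5):
    # RBF kernel values between a query point x and each support vector (row of sv)
    return np.exp(-gamma * np.sum((sv - x) ** 2, axis=1))

def svm_decision(x, sv, coef, b, gamma=0.5):
    # Full SVM prediction: weighted sum over ALL support vectors,
    # so cost grows linearly with the number of SV.
    return float(np.dot(coef, rbf_kernel(x, sv, gamma)) + b)

def knee_cut_decision(x, sv, coef, b, k, gamma=0.5):
    # Illustrative truncation: keep only the k support vectors with the
    # largest kernel values (the ones nearest the query dominate the sum).
    kvals = rbf_kernel(x, sv, gamma)
    top = np.argsort(kvals)[-k:]  # indices of the k largest kernel values
    return float(np.dot(coef[top], kvals[top]) + b)
```

With `k` equal to the number of support vectors the truncated sum reproduces the full decision value exactly; smaller `k` trades a bounded approximation error for fewer kernel evaluations at prediction time.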
Keywords
learning (artificial intelligence), optimisation, statistical analysis, support vector machines (SVM), KCSVM, KCOOSVM, knee-cut SVM, knee-cut ordinal optimization inspired SVM, machine learning, mathematical foundations, ordinal kernel trick, statistical learning theory, ordinal optimization, real-time testing, sparse decision rules, supervised and binary classification