A Dual-Based Pruning Method for the Least-Squares Support Vector Machine

Xiao-Lei Xia, Shang-Ming Zhou, Mingxing Ouyang, Dafang Xiang, Zhijun Zhang, Zexiang Zhou

IFAC-PapersOnLine (2023)

Abstract
The least-squares support vector machine (LS-SVM) is generally parameterized by a large number of support vectors, which slows down classification. This paper proposes to search for and prune two types of support vectors. The first type comprises potential outliers, each of which is misclassified by the model trained on the remaining samples. The second type comprises samples whose removal causes the least perturbation to the dual objective function. Rather than explicitly re-running the training procedure, the LS-SVM model corresponding to the omission of a training sample is derived analytically from the LS-SVM trained on the whole training set. This derivation reduces the computational cost of pruning a sample and constitutes the main technical contribution of this paper. Experimental results on six UCI datasets show that, compared with classical pruning methods, the proposed algorithm can significantly enhance the sparsity of the LS-SVM while maintaining satisfactory generalization performance.
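
The analytic leave-one-out idea in the abstract can be illustrated with a minimal sketch. The snippet below is not the paper's algorithm: it uses the common labels-as-targets LS-SVM formulation, the standard closed-form leave-one-out residual r_i = alpha_i / (H^-1)_{ii} (a textbook identity for LS-SVM) to flag samples misclassified when omitted, and |alpha_i| as a rough stand-in for the dual-objective perturbation that the paper derives exactly. The kernel width, regularization constant C, and keep_ratio are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, width=0.5):
    # Gaussian RBF kernel matrix between the rows of X1 and X2.
    d2 = (np.sum(X1 ** 2, axis=1)[:, None]
          + np.sum(X2 ** 2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-width * d2)

def train_ls_svm(K, y, C=10.0):
    # Solve the LS-SVM linear system (labels-as-targets form):
    #   [[0, 1^T], [1, K + I/C]] [b; alpha] = [0; y].
    # The inverse of the system matrix is also returned, since the
    # analytic leave-one-out step below reads its diagonal.
    n = len(y)
    H = np.zeros((n + 1, n + 1))
    H[0, 1:] = 1.0
    H[1:, 0] = 1.0
    H[1:, 1:] = K + np.eye(n) / C
    rhs = np.concatenate(([0.0], y.astype(float)))
    H_inv = np.linalg.inv(H)
    sol = H_inv @ rhs
    return sol[1:], sol[0], H_inv  # alpha, b, H^{-1}

def prune_ls_svm(X, y, C=10.0, width=0.5, keep_ratio=0.5):
    # One illustrative pruning pass:
    # 1) remove "potential outliers": samples misclassified when left out,
    #    found from the closed-form LOO residual r_i = alpha_i / (H^{-1})_{ii};
    # 2) among the rest, remove samples with the smallest |alpha_i|, used here
    #    only as a cheap proxy for "least perturbation of the dual objective"
    #    (the paper derives the exact perturbation instead).
    K = rbf_kernel(X, X, width)
    alpha, b, H_inv = train_ls_svm(K, y, C)

    loo_res = alpha / np.diag(H_inv)[1:]   # r_i = y_i - f^{(-i)}(x_i)
    f_loo = y - loo_res                    # leave-one-out decision values
    keep = (y * f_loo) > 0                 # drop type-1 samples (outliers)

    n_keep = int(keep_ratio * len(y))
    for i in np.argsort(np.abs(alpha)):    # smallest |alpha| first
        if keep.sum() <= n_keep:
            break
        keep[i] = False                    # drop type-2 samples

    K_kept = rbf_kernel(X[keep], X[keep], width)
    alpha_s, b_s, _ = train_ls_svm(K_kept, y[keep], C)
    return X[keep], y[keep], alpha_s, b_s

# Toy usage on synthetic two-class data.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, (100, 2)), rng.normal(1.0, 1.0, (100, 2))])
y = np.concatenate([-np.ones(100), np.ones(100)])
X_sv, y_sv, alpha_sv, b_sv = prune_ls_svm(X, y, keep_ratio=0.3)
pred = np.sign(rbf_kernel(X, X_sv, 0.5) @ alpha_sv + b_sv)
print(f"kept {len(y_sv)} of {len(y)} samples, training accuracy {(pred == y).mean():.3f}")
```

The point of the sketch is that every candidate removal is evaluated from quantities already available in the full solution (alpha and the diagonal of H^-1), so no retraining is needed during the search, which mirrors the cost argument made in the abstract.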
Keywords
least-squares support vector machine, sparsity, pruning methods, dual form, the method of Lagrange multipliers.