Optimal feasible step-size based working set selection for large scale SVMs training
Neurocomputing (2020)
Abstract
Efficient training of support vector machines (SVMs) on large-scale samples is of crucial importance in the era of big data. Sequential minimal optimization (SMO) is considered an effective solution to this challenging task, and working set selection is one of the key steps in SMO. Various strategies for working set selection have been developed and implemented in LibSVM and Shark. In this work we point out that the algorithm used in LibSVM does not maintain the box constraints, which are nevertheless very important for evaluating the final gain of the selection operation. We propose a new algorithm to address this issue: it maintains the box constraints within the selection procedure by using a feasible optimal step-size. We systematically study and compare several related algorithms and derive new theoretical results. Experiments on benchmark data sets show that our algorithm effectively improves the training speed without loss of accuracy.
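To make the abstract's point concrete, the sketch below illustrates (in NumPy) what second-order working set selection with a box-constrained step might look like: the first index is chosen by maximal violation, and the candidate second index is scored by the gain of the pairwise update after the unconstrained Newton step has been clipped to the box [0, C]. This is a hypothetical illustration under standard SMO conventions, not the authors' actual implementation; the function name and all details are assumptions.

```python
import numpy as np

def select_working_pair(alpha, grad, Q, y, C):
    """Hypothetical sketch of second-order working set selection.

    alpha : current dual variables, shape (n,)
    grad  : gradient of the dual objective at alpha, shape (n,)
    Q     : kernel matrix with labels absorbed as in standard SMO
    y     : labels in {+1, -1}
    C     : box-constraint upper bound
    """
    f = -y * grad  # per-sample "violation" scores

    # Index sets for feasible movement directions (standard SMO convention).
    up = ((y > 0) & (alpha < C)) | ((y < 0) & (alpha > 0))
    low = ((y > 0) & (alpha > 0)) | ((y < 0) & (alpha < C))

    # First index: most violating feasible direction.
    i = int(np.argmax(np.where(up, f, -np.inf)))

    best_j, best_gain = -1, -np.inf
    for j in range(len(alpha)):
        if not low[j] or f[j] >= f[i]:
            continue  # not a violating pair
        b = f[i] - f[j]                                  # first-order decrease
        a = max(Q[i, i] + Q[j, j] - 2 * y[i] * y[j] * Q[i, j], 1e-12)
        t = b / a                                        # unconstrained Newton step
        # Clip the step so both alpha_i and alpha_j stay in [0, C]:
        # alpha_i moves by +y_i * t, alpha_j by -y_j * t.
        t = min(t, C - alpha[i]) if y[i] > 0 else min(t, alpha[i])
        t = min(t, alpha[j]) if y[j] > 0 else min(t, C - alpha[j])
        gain = b * t - 0.5 * a * t * t                   # gain of the *feasible* step
        if gain > best_gain:
            best_gain, best_j = gain, j
    return i, best_j
```

The key difference from a purely unconstrained second-order rule is that the gain used to rank candidate pairs is evaluated at the clipped (feasible) step, which is the aspect the paper argues LibSVM's selection does not account for.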
Keywords
Support vector machines, Decomposition algorithm, Working set selection, Sequential minimal optimization, Feasible step-size