SVM-Optimization and Steepest-Descent Line Search
COLT (2009)
Abstract
We consider (a subclass of) convex quadratic optimization problems and analyze decomposition algorithms that perform, at least approximately, steepest-descent exact line search. We show that these algorithms, when implemented properly, are within ε of optimality after O(log(1/ε)) iterations for strictly convex cost functions, and after O(1/ε) iterations in the general case. Our analysis is general enough to cover the algorithms used in software packages like SVMTorch and (first- or second-order) LibSVM. To the best of our knowledge, this is the first paper to establish a convergence rate for these algorithms without introducing unnecessarily restrictive assumptions.
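The abstract's core primitive can be illustrated on an unconstrained strictly convex quadratic. The following is a minimal sketch, not the paper's decomposition algorithm (which works on SVM dual subproblems): plain steepest descent with exact line search on f(x) = ½xᵀQx − bᵀx, where the exact step along −g is α = (gᵀg)/(gᵀQg); the function name and tolerances are illustrative choices, not from the paper.

```python
import numpy as np

def steepest_descent_exact(Q, b, x0, tol=1e-10, max_iter=1000):
    """Steepest descent with exact line search on f(x) = 0.5 x'Qx - b'x.

    Illustrative sketch only; assumes Q is symmetric positive definite,
    so f is strictly convex and the minimizer satisfies Q x* = b.
    """
    x = x0.astype(float)
    for _ in range(max_iter):
        g = Q @ x - b                      # gradient of f at x
        if np.linalg.norm(g) < tol:        # (approximate) stationarity
            break
        alpha = (g @ g) / (g @ Q @ g)      # exact minimizer of f(x - a*g) over a
        x = x - alpha * g
    return x

Q = np.array([[3.0, 1.0], [1.0, 2.0]])     # symmetric positive definite
b = np.array([1.0, 1.0])
x_star = steepest_descent_exact(Q, b, np.zeros(2))
```

In the strictly convex case this iteration contracts the suboptimality by a fixed factor per step, which is the mechanism behind an O(log(1/ε)) iteration bound of the kind the abstract states.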
Keywords
line search, second order, convergence rate, cost function, steepest descent, optimization problem