Support vector ordinal regression.

Neural Computation (2007)

Cited by 339
Abstract
In this letter, we propose two new support vector approaches for ordinal regression, which optimize multiple thresholds to define parallel discriminant hyperplanes for the ordinal scales. Both approaches guarantee that the thresholds are properly ordered at the optimal solution. The size of these optimization problems is linear in the number of training samples. The sequential minimal optimization algorithm is adapted for the resulting optimization problems; it is extremely easy to implement and scales efficiently as a quadratic function of the number of examples. The results of numerical experiments on some benchmark and real-world data sets, including applications of ordinal regression to information retrieval, verify the usefulness of these approaches.
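The prediction rule implied by the learned thresholds is straightforward: the latent score w·φ(x) is compared against the ordered thresholds, and the example receives the rank of the interval it falls into. The sketch below illustrates only this thresholding step, assuming NumPy, a 1-based rank convention, and precomputed scores; it is not taken from the authors' code.

```python
import numpy as np

def predict_ordinal(scores, thresholds):
    """Map latent scores f(x) = w . phi(x) to ordinal ranks 1..r.

    thresholds must be sorted (b_1 < ... < b_{r-1}); they split the real
    line into r intervals, one per ordinal scale.
    """
    # searchsorted returns, for each score, the index of the first interval
    # boundary strictly greater than it; adding 1 gives a 1-based rank.
    return np.searchsorted(thresholds, scores, side="right") + 1

# Toy usage: three thresholds define four ordinal categories.
thresholds = np.array([-1.0, 0.0, 1.5])
scores = np.array([-2.3, -0.4, 0.7, 2.1])
print(predict_ordinal(scores, thresholds))  # -> [1 2 3 4]
```

The guarantee highlighted in the abstract matters precisely here: the rule is only well defined when the thresholds are properly ordered at the optimum.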
Keywords
numerical experiment, ordinal regression, optimal solution, information retrieval, parallel discriminant hyperplanes, support vector ordinal regression, optimization problem, sequential minimal optimization algorithm, new support vector approach, multiple threshold, ordinal scale, sequential minimal optimization, support vector