Kernel Learning with Nonconvex Ramp Loss

Statistical Analysis and Data Mining (2022)

Abstract
We study kernel learning problems with ramp loss, a nonconvex but noise-resistant loss function. In this work, we justify the validity of ramp loss under the classical kernel learning framework. In particular, we show that the generalization bound for the empirical ramp risk minimizer is similar to that of convex surrogate losses, which implies that kernel learning with this loss function is not only noise-resistant but, more importantly, statistically consistent. To adapt to real-time data streams, we introduce PA-ramp, a heuristic online algorithm based on the passive-aggressive framework, to solve this learning problem. Empirically, with fewer support vectors, this algorithm achieves robust performance in the tested noisy scenarios.
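As background, the following is a minimal sketch of the ramp loss and of the classical passive-aggressive (PA-I) update that the PA framework refers to. The truncation level `s`, the linear (rather than kernel) model, and the function names are illustrative assumptions; the paper's actual PA-ramp update is a heuristic variant not reproduced here.

```python
import numpy as np

def ramp_loss(margin, s=-1.0):
    # One common parameterization of the ramp loss: a hinge loss truncated
    # at 1 - s, i.e. min(1 - s, max(0, 1 - margin)). The cap bounds the
    # penalty any single (possibly mislabeled) example can contribute,
    # which is the source of noise resistance; the cost is nonconvexity.
    return np.minimum(1.0 - s, np.maximum(0.0, 1.0 - margin))

def pa_update(w, x, y, C=1.0):
    # Classical PA-I step (Crammer et al.) for a linear model, shown only
    # to illustrate the passive-aggressive framework the paper builds on.
    loss = max(0.0, 1.0 - y * np.dot(w, x))  # hinge loss on this example
    if loss > 0.0:
        tau = min(C, loss / np.dot(x, x))    # aggressiveness-capped step size
        w = w + tau * y * x                  # smallest update that fixes the margin
    return w
```

When the current example is classified with margin at least 1, the update is passive (no change); otherwise the weight vector moves just enough toward satisfying the margin, capped by `C`.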
Keywords
generalization bound, kernel methods, passive-aggressive algorithm, ramp loss, statistical consistency