Online Orthogonal Regression Based On A Regularized Squared Loss

2018 17th IEEE International Conference on Machine Learning and Applications (ICMLA), 2018

Abstract
Orthogonal regression extends the classical regression framework by assuming that the data may contain errors in both the dependent and independent variables. This approach often outperforms classical regression in real-world scenarios. However, the standard algorithms for solving the orthogonal regression problem require computing singular value decompositions (SVD), which can be computationally expensive and impractical for real-world problems. In this work, we propose a new approach to the orthogonal regression problem based on a regularized squared loss. The method follows an online learning strategy, which makes it more flexible for different types of applications. The algorithm is derived in primal and dual variables, and the latter formulation allows the introduction of kernels for nonlinear modeling. We compare our proposed orthogonal regression algorithm to a corresponding classical regression algorithm on both synthetic and real-world datasets from different applications. Our algorithm achieved better results on most of the datasets.
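For context, the SVD-based baseline the abstract refers to can be sketched as follows. This is a minimal illustration of classical orthogonal (total least squares) line fitting via SVD, not the paper's proposed online algorithm; the function name and data are invented for the example.

```python
import numpy as np

def orthogonal_regression_svd(x, y):
    """Fit y ~ a*x + b by minimizing orthogonal distances, via SVD.

    Illustrative sketch of the classical TLS solution (not the paper's
    online method). Assumes the best-fit line is not vertical.
    """
    # Center the data: the orthogonal-regression line passes through the centroid.
    xm, ym = x.mean(), y.mean()
    Z = np.column_stack([x - xm, y - ym])
    # The right-singular vector with the smallest singular value is the
    # normal direction of the best-fit line.
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    nx, ny = Vt[-1]
    a = -nx / ny          # slope from the normal vector (nx, ny)
    b = ym - a * xm       # intercept so the line passes through the centroid
    return a, b

# Synthetic data with noise in BOTH variables, the setting where
# orthogonal regression is expected to beat ordinary least squares.
rng = np.random.default_rng(0)
x_true = np.linspace(0.0, 10.0, 200)
x = x_true + rng.normal(scale=0.3, size=x_true.shape)
y = 2.0 * x_true + 1.0 + rng.normal(scale=0.3, size=x_true.shape)
a, b = orthogonal_regression_svd(x, y)
```

The SVD here costs O(n) per fit but must be recomputed from scratch as data arrives, which is the scalability issue the paper's online formulation is designed to avoid.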
Keywords
Orthogonal regression, Online learning, Kernel