A Geometrical Analysis of Kernel Ridge Regression and its Applications
arXiv (2024)
Abstract
We obtain upper bounds for the estimation error of Kernel Ridge Regression
(KRR) for all non-negative regularization parameters, offering a geometric
perspective on various phenomena in KRR. As applications: 1. We address the
multiple descent problem, unifying the proofs of arxiv:1908.10292 and
arxiv:1904.12191 for polynomial kernels, and we establish multiple descent for
the upper bound on the estimation error of KRR under sub-Gaussian design in
the non-asymptotic regime. 2. For a sub-Gaussian design vector and in the
non-asymptotic regime, we prove the Gaussian Equivalence Conjecture. 3. We
offer a novel perspective on the linearization of kernel matrices of non-linear
kernels, extending it to the power regime for polynomial kernels. 4. Our theory
applies to data-dependent kernels, providing a convenient and accurate
tool for the feature-learning regime in deep learning theory. 5. Our theory
extends the results of arxiv:2009.14286 under a weak moment assumption.
Our proof is based on three mathematical tools developed in this paper that
may be of independent interest: 1. A Dvoretzky-Milman theorem for ellipsoids
under (very) weak moment assumptions. 2. A Restricted Isomorphic Property in
Reproducing Kernel Hilbert Spaces with embedding index conditions. 3. A
concentration inequality for finite-degree polynomial kernel functions.
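To make the object of study concrete, below is a minimal numerical sketch of closed-form KRR with a finite-degree polynomial kernel, the kernel class singled out in applications 1 and 3 above. Everything here (function names, the toy target, the noise level, the sub-Gaussian design being simply Gaussian) is our own illustration, not code or an experiment from the paper.

```python
import numpy as np

def poly_kernel(X, Z, degree=3, c=1.0):
    """Finite-degree polynomial kernel k(x, z) = (c + <x, z>)^degree."""
    return (c + X @ Z.T) ** degree

def krr_fit_predict(X_train, y_train, X_test, lam, kernel=poly_kernel):
    """Closed-form KRR prediction. lam = 0 gives the minimum-norm
    interpolant, computed via the pseudoinverse for numerical safety."""
    n = X_train.shape[0]
    K = kernel(X_train, X_train)
    if lam > 0:
        alpha = np.linalg.solve(K + n * lam * np.eye(n), y_train)
    else:
        alpha = np.linalg.pinv(K) @ y_train  # ridgeless / interpolating limit
    return kernel(X_test, X_train) @ alpha

# Toy usage: Gaussian (hence sub-Gaussian) design, noisy nonlinear target.
rng = np.random.default_rng(0)
n, d = 200, 10
X = rng.standard_normal((n, d)) / np.sqrt(d)
f_star = lambda X: np.sin(X @ np.ones(d))
y = f_star(X) + 0.1 * rng.standard_normal(n)
X_test = rng.standard_normal((1000, d)) / np.sqrt(d)
for lam in [0.0, 1e-3, 1e-1]:
    pred = krr_fit_predict(X, y, X_test, lam)
    err = np.mean((pred - f_star(X_test)) ** 2)
    print(f"lambda={lam:g}  test MSE={err:.4f}")
```

Sweeping lam over the full non-negative range, as the paper's bounds cover, traces the estimation-error curve on which phenomena such as multiple descent appear.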