Regress Consistently when Oblivious Outliers Overwhelm

arXiv (2020)

Abstract
We give a novel analysis of the Huber loss estimator for consistent robust linear regression, proving that it simultaneously achieves an optimal dependency on the fraction of outliers and on the dimension. We consider a linear regression model with an oblivious adversary, who may corrupt the observations in an arbitrary way but without knowing the data. (This adversary model also captures heavy-tailed noise distributions.) Given observations $y_1,\ldots,y_n$ of which an $\alpha$ fraction are uncorrupted, we obtain error guarantees $\tilde{O}(\sqrt{d/(\alpha^2 n)})$, optimal up to logarithmic terms. Our algorithm works with a nearly optimal fraction of inliers $\alpha\geq \tilde{O}(\sqrt{d/n})$ and under mild restricted isometry property (RIP) assumptions on the (transposed) design matrix. Prior to this work, even in the simple case of spherical Gaussian design, no estimator was known to achieve vanishing error guarantees in the high-dimensional setting $d\gtrsim \sqrt{n}$ whenever the fraction of uncorrupted observations is smaller than $1/\log n$. Our analysis of the Huber loss estimator exploits only the first-order optimality conditions. Furthermore, in the special case of Gaussian design $X\sim N(0,1)^{n \times d}$, we show that a strikingly simple algorithm based on computing coordinate-wise medians achieves similar guarantees in linear time. The algorithm also extends to the setting where the parameter vector $\beta^*$ is sparse.
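As a concrete illustration of the Huber loss estimator the abstract refers to, the sketch below fits linear regression by plain gradient descent on the Huber objective under a Gaussian design with obliviously corrupted observations. This is a minimal illustration under assumed parameter choices (transition parameter `h=1.0`, Cauchy-distributed corruptions as a stand-in for the oblivious adversary), not the paper's algorithm or analysis.

```python
import numpy as np

def huber_grad(r, h):
    # Gradient of the Huber penalty: r on |r| <= h, h*sign(r) outside.
    return np.clip(r, -h, h)

def huber_regression(X, y, h=1.0, steps=500):
    # Gradient descent on sum_i Phi_h(y_i - <x_i, beta>);
    # step size 1/||X||_2^2 bounds the gradient's Lipschitz constant.
    n, d = X.shape
    lr = 1.0 / np.linalg.norm(X, 2) ** 2
    beta = np.zeros(d)
    for _ in range(steps):
        r = y - X @ beta
        beta += lr * (X.T @ huber_grad(r, h))
    return beta

rng = np.random.default_rng(0)
n, d, alpha = 2000, 10, 0.5          # alpha = fraction of uncorrupted rows
X = rng.standard_normal((n, d))      # spherical Gaussian design
beta_star = rng.standard_normal(d)

# Oblivious corruptions: chosen independently of X (heavy-tailed noise).
eta = np.zeros(n)
mask = rng.random(n) > alpha
eta[mask] = 100.0 * rng.standard_cauchy(mask.sum())

y = X @ beta_star + 0.1 * rng.standard_normal(n) + eta
beta_hat = huber_regression(X, y)
err = np.linalg.norm(beta_hat - beta_star)
```

Even with roughly half the observations grossly corrupted, the clipped gradient bounds each outlier's influence, so the recovered `beta_hat` stays close to `beta_star`, consistent with the $\tilde{O}(\sqrt{d/(\alpha^2 n)})$ error scaling.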
Keywords
oblivious outliers overwhelm