ADAPTIVE RISK BOUNDS IN UNIVARIATE TOTAL VARIATION DENOISING AND TREND FILTERING

ANNALS OF STATISTICS (2020)

Abstract
We study trend filtering, a relatively recent method for univariate nonparametric regression. For a given integer r >= 1, the rth order trend filtering estimator is defined as the minimizer of the sum of squared errors when we constrain (or penalize) the sum of the absolute rth order discrete derivatives of the fitted function at the design points. For r = 1, the estimator reduces to total variation regularization, which has received much attention in the statistics and image processing literature. In this paper, we study the performance of the trend filtering estimator for every r >= 1, both in the constrained and penalized forms. Our main results show that in the strong sparsity setting, when the underlying function is a (discrete) spline with few "knots," the risk (under the global squared error loss) of the trend filtering estimator (with an appropriate choice of the tuning parameter) achieves the parametric n^{-1} rate, up to a logarithmic (multiplicative) factor. Our results therefore provide support for the use of trend filtering, for every r >= 1, in the strong sparsity setting.
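As a concrete illustration of the definition above, the following minimal sketch evaluates the penalized trend filtering objective: the sum of squared errors plus a tuning parameter times the sum of absolute rth order discrete differences of the fitted values. Function and variable names are illustrative (not from the paper), and the normalization of the discrete derivatives used in the paper's formal definition is omitted here.

```python
import numpy as np

def trend_filter_objective(theta, y, lam, r=1):
    """Penalized trend filtering objective (illustrative sketch):
    sum of squared errors plus lam times the sum of absolute
    r-th order discrete differences of the fitted values theta.
    For r = 1 the penalty is the (discrete) total variation of theta."""
    sse = np.sum((y - theta) ** 2)
    penalty = np.sum(np.abs(np.diff(theta, n=r)))  # r-th order differences
    return sse + lam * penalty

# A linear fit has zero second-order differences, so for r = 2
# only the squared-error term contributes -- this is the "few knots"
# (here: zero knots) case in which strong sparsity holds.
y = np.array([0.1, 1.0, 2.1, 2.9, 4.0])
theta_lin = np.linspace(0.0, 4.0, 5)  # exactly linear fitted values
obj = trend_filter_objective(theta_lin, y, lam=1.0, r=2)
```

The estimator itself is the minimizer of this objective over theta, a convex (l1-penalized) problem; the sketch only evaluates the criterion being minimized.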
Keywords
Adaptive splines, discrete splines, fat shattering, higher order total variation regularization, metric entropy bounds, nonparametric function estimation, risk bounds, subdifferential, tangent cone