Variable metric inexact line-search based methods for nonsmooth optimization

SIAM Journal on Optimization (2016)

Cited by 61 | Views 65
Abstract
We develop a new proximal-gradient method for minimizing the sum of a differentiable, possibly nonconvex, function and a convex, possibly nondifferentiable, function. The key features of the proposed method are the definition of a suitable descent direction, based on the proximal operator associated with the convex part of the objective function, and an Armijo-like rule that determines the stepsize along this direction and ensures sufficient decrease of the objective function. Within this framework, we address, in particular, the possibility of adopting a metric that may change at each iteration and an inexact computation of the proximal point defining the descent direction. For the general nonconvex case, we prove that all limit points of the sequence of iterates are stationary, while for convex objective functions we prove convergence of the whole sequence to a minimizer, under the assumption that a minimizer exists. In the latter case, assuming also that the gradient of the smooth part of the objective function is Lipschitz continuous, we give a convergence rate estimate, showing O(1/k) complexity with respect to the function values. We also discuss verifiable sufficient conditions for the inexact proximal point and present the results of two numerical tests on total-variation-based image restoration problems, showing that the proposed approach is competitive with other state-of-the-art methods.
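To make the iteration concrete, below is a minimal sketch of one step in Python. It assumes the Euclidean metric (identity D_k) and an exact proximal computation, both simplifications of the paper's variable-metric, inexact setting; the names vmilb_step, f, g, grad_f, and prox_g are hypothetical interfaces for illustration, not the authors' implementation.

```python
import numpy as np

def vmilb_step(x, f, g, grad_f, prox_g, alpha=1.0, beta=1e-4, delta=0.5):
    """One simplified iteration: compute the proximal point of the
    gradient step, take d = y - x as the descent direction, and
    backtrack along d with an Armijo-like sufficient-decrease rule.
    The identity metric is used here; the paper allows a variable
    metric and an inexact proximal point, both omitted in this sketch.
    """
    y = prox_g(x - alpha * grad_f(x), alpha)   # forward-backward point
    d = y - x                                  # descent direction
    # h combines the linearized smooth part with the exact nonsmooth
    # part; it is nonpositive, and zero only when x is stationary.
    h = grad_f(x) @ d + g(y) - g(x)
    F_x = f(x) + g(x)
    lam = 1.0
    # Armijo-like rule: shrink lam until sufficient decrease of the
    # full (possibly nonsmooth) objective F = f + g is achieved.
    while f(x + lam * d) + g(x + lam * d) > F_x + beta * lam * h:
        lam *= delta
    return x + lam * d

# Usage sketch on a toy problem: f(x) = 0.5||Ax - b||^2, g = ||.||_1,
# whose proximal operator is the soft-thresholding map.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad_f = lambda x: A.T @ (A @ x - b)
g = lambda x: np.sum(np.abs(x))
prox_g = lambda z, a: np.sign(z) * np.maximum(np.abs(z) - a, 0.0)

x = np.zeros(2)
for _ in range(200):
    x = vmilb_step(x, f, g, grad_f, prox_g, alpha=0.1)
```

Because the backtracking tests decrease of the full objective f + g along d, the sketch does not need a Lipschitz constant for grad_f up front; the O(1/k) rate in the abstract, however, is stated under the Lipschitz assumption on the smooth part.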
Keywords
proximal algorithms, nonsmooth optimization, generalized projection, nonconvex optimization