Stochastic alternating structure-adapted proximal gradient descent method with variance reduction for nonconvex nonsmooth optimization

MATHEMATICS OF COMPUTATION(2024)

Abstract
Blocky (block-structured) optimization has gained significant attention in a wide range of practical applications. Following recent work (M. Nikolova and P. Tan [SIAM J. Optim. 29 (2019), pp. 2053-2078]) on solving a class of nonconvex nonsmooth optimization problems, we develop a stochastic alternating structure-adapted proximal (s-ASAP) gradient descent method for solving blocky optimization problems. By deploying state-of-the-art variance-reduced gradient estimators (rather than the full gradient) from stochastic optimization, the s-ASAP method is applicable to nonconvex problems whose objective is the sum of a nonsmooth data-fitting term and a finite number of differentiable functions. The sublinear convergence rate of s-ASAP is established within the proximal point algorithmic framework, whilst its linear convergence rate is achieved under an error bound condition. Furthermore, convergence of the sequence produced by s-ASAP is established under the Kurdyka-Łojasiewicz property. Preliminary numerical simulations on image processing applications demonstrate the compelling performance of the proposed method.
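For illustration only (this is a generic prox-SVRG-style sketch, not the authors' s-ASAP scheme), consider a single block with objective $g(x) + \frac{1}{n}\sum_{i=1}^{n} f_i(x)$, where $g$ is nonsmooth and each $f_i$ is differentiable; a variance-reduced proximal gradient step with snapshot point $\tilde{x}$, uniformly sampled index $i_k$, and step size $\gamma > 0$ reads
$$
v^k = \nabla f_{i_k}(x^k) - \nabla f_{i_k}(\tilde{x}) + \frac{1}{n}\sum_{i=1}^{n} \nabla f_i(\tilde{x}),
\qquad
x^{k+1} = \operatorname{prox}_{\gamma g}\!\bigl(x^k - \gamma\, v^k\bigr).
$$
The estimator $v^k$ is unbiased and its variance vanishes as $x^k$ and $\tilde{x}$ approach a stationary point, which is the mechanism that allows replacing the full gradient while retaining convergence guarantees.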
Keywords
Nonconvex nonsmooth optimization, variance reduction, proximity, sublinear convergence rate, error bound, Kurdyka-Łojasiewicz property