Linearized ADMM for Non-convex Non-smooth Optimization with Convergence Analysis
IEEE Access (2019)
Abstract
The linearized alternating direction method of multipliers (ADMM), an extension of ADMM, has been widely used to solve linearly constrained problems in signal processing, machine learning, communications, and many other fields. Despite its broad application in nonconvex optimization, its theoretical convergence guarantee remains an open problem for a large class of nonconvex and nonsmooth objective functions. In this paper, we propose a two-block linearized ADMM and a multi-block parallel linearized ADMM for problems with nonconvex and nonsmooth objectives. We prove that the algorithms converge for a broader class of objective functions under weaker assumptions than previous works. Furthermore, the proposed multi-block algorithm updates coupled variables in parallel and handles less restrictive nonconvex problems, for which traditional ADMM may have difficulty solving the subproblems.
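To make the two-block scheme concrete, here is a minimal sketch of a linearized ADMM iteration for a toy instance of min f(x) + g(y) subject to Ax - y = 0, with f the l1-norm and g a simple quadratic. This is an illustrative convex instance, not the paper's algorithm or its nonconvex setting: the x-subproblem linearizes the quadratic coupling term at the current iterate (step size 1/mu with mu >= rho * ||A||_2^2), the y-subproblem is solved exactly in closed form, and the multiplier is updated by dual ascent. All problem data, step-size choices, and iteration counts below are assumptions for the demo.

```python
import numpy as np

# Toy problem data (assumed for illustration):
# minimize ||x||_1 + 0.5 * ||y - d||^2  subject to  A x - y = 0
rng = np.random.default_rng(0)
m, n = 5, 8
A = rng.standard_normal((m, n))
d = rng.standard_normal(m)

rho = 1.0                                       # penalty parameter
mu = 1.1 * rho * np.linalg.norm(A, 2) ** 2      # mu >= rho * ||A||_2^2

def soft(v, t):
    """Soft-thresholding: proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(n)
y = np.zeros(m)
lam = np.zeros(m)  # Lagrange multiplier for A x - y = 0

for _ in range(3000):
    # Linearized x-update: gradient of the smooth coupling term at x^k,
    # followed by the prox of (1/mu) * ||.||_1.
    grad = A.T @ (lam + rho * (A @ x - y))
    x = soft(x - grad / mu, 1.0 / mu)
    # Exact y-update: closed form since g is quadratic.
    y = (d + lam + rho * (A @ x)) / (1.0 + rho)
    # Dual ascent on the constraint residual.
    lam = lam + rho * (A @ x - y)

print(np.linalg.norm(A @ x - y))  # constraint residual, should be small
```

The parallel multi-block variant described in the paper follows the same pattern, with each coupled block performing its own linearized proximal step simultaneously against the shared residual.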
Keywords
Linearized ADMM, multi-block ADMM, nonconvex optimization, parallel computation, proximal algorithm