Synthesis of accelerated gradient algorithms for optimization and saddle point problems using Lyapunov functions and LMIs

Systems & Control Letters (2022)

Abstract
This paper considers the problem of designing accelerated gradient-based algorithms for optimization and saddle-point problems. The class of objective functions is defined by a generalized sector condition. This class contains strongly convex functions with Lipschitz gradients as well as certain non-convex functions, which makes it possible to address not only optimization problems but also saddle-point problems. The proposed design procedure relies on a suitable class of Lyapunov functions and, for a fixed convergence rate, is a convex semi-definite program in all but one scalar parameter. The proposed synthesis allows the design of algorithms that match, and in some cases exceed, the performance of state-of-the-art accelerated gradient methods.
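
To make the mechanics concrete, the sketch below (Python with CVXPY, not taken from the paper) sets up the kind of Lyapunov/S-procedure LMI that such syntheses build on: for a candidate rate rho, feasibility of a semidefinite program certifies rho-contraction of a fixed algorithm over the sector-bounded gradient class. The paper's synthesis additionally treats the algorithm coefficients as decision variables, which remains convex for a fixed rate up to one scalar parameter that is swept. The instance data (gradient-descent coefficients, m = 1, L = 10) are illustrative assumptions, not results from the paper.

# Minimal, hedged sketch (not the paper's synthesis LMI): a Lyapunov/LMI rate
# certificate for the state-space model xi_{k+1} = A xi_k + B u_k,
# u_k = grad f(C xi_k), with the sector condition (u - m*y)(L*y - u) >= 0.
import numpy as np
import cvxpy as cp

def rate_certified(rho, A, B, C, m, L, eps=1e-8):
    """Feasibility of the S-procedure LMI certifying ||xi_k - xi*|| = O(rho^k)."""
    n = A.shape[0]
    P = cp.Variable((n, n), symmetric=True)      # Lyapunov matrix
    lam = cp.Variable(nonneg=True)               # sector multiplier
    # Quadratic form of the sector condition in the variables (y, u).
    S = np.array([[-m * L, (m + L) / 2.0],
                  [(m + L) / 2.0, -1.0]])
    # Selector mapping (xi, u) -> (y, u).
    E = np.block([[C, np.zeros((C.shape[0], B.shape[1]))],
                  [np.zeros((B.shape[1], n)), np.eye(B.shape[1])]])
    lyap = cp.bmat([[A.T @ P @ A - rho**2 * P, A.T @ P @ B],
                    [B.T @ P @ A,              B.T @ P @ B]])
    lmi = lyap + lam * (E.T @ S @ E)
    prob = cp.Problem(cp.Minimize(0), [P >> eps * np.eye(n), lmi << 0])
    prob.solve(solver=cp.SCS)
    return prob.status in ("optimal", "optimal_inaccurate")

# Verifiable toy instance: gradient descent x_{k+1} = x_k - alpha*grad f(x_k)
# with alpha = 2/(L+m); the certified rate should approach (kappa-1)/(kappa+1).
# A two-state accelerated method (heavy-ball/Nesterov-type, or a synthesized
# one) plugs into the same (A, B, C) template.
m, L = 1.0, 10.0
alpha = 2.0 / (L + m)
A = np.array([[1.0]])
B = np.array([[-alpha]])
C = np.array([[1.0]])

lo, hi = 0.0, 1.0
for _ in range(30):                  # bisection on the rate rho
    mid = 0.5 * (lo + hi)
    if rate_certified(mid, A, B, C, m, L):
        hi = mid
    else:
        lo = mid
print(f"certified rate ~ {hi:.4f}  ((kappa-1)/(kappa+1) = {(L/m - 1)/(L/m + 1):.4f})")

The synthesis step described in the abstract replaces the fixed (A, B, C) above with decision variables and, for a fixed rate, solves a convex SDP in all but one scalar parameter, which is gridded or bisected over.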
Keywords
Accelerated gradient methods, Linear matrix inequalities, Saddle point problems, Convex synthesis