Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions

Numerical Algorithms (2019)

Abstract
It is well known that conjugate gradient algorithms are widely applied in many practical fields, such as engineering problems and finance models, because they are straightforward and characterized by a simple structure and low storage requirements. However, challenging problems remain, such as the convergence of the PRP algorithm for nonconvex functions under an inexact line search, obtaining sufficient descent for all conjugate gradient methods, and other theoretical properties concerning global convergence and the trust-region feature for nonconvex functions. This paper studies a family of conjugate gradient formulas based on the six classic formulas PRP, HS, CD, FR, LS, and DY, where the family algorithms possess better theoretical properties than the underlying formulas themselves. Furthermore, the technique behind the presented formulas can be extended to any two-term conjugate gradient formula. This paper designs family conjugate gradient algorithms for nonconvex functions with the following features, which hold without additional conditions: (i) the sufficient descent property holds, (ii) the trust-region feature is true, and (iii) global convergence holds under standard assumptions. Numerical results show that the given conjugate gradient algorithms are competitive with standard methods.
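For context, the classic two-term conjugate gradient scheme the paper builds on can be sketched as follows. This is an illustrative implementation of the standard PRP formula with a backtracking Armijo (inexact) line search and a PRP+ nonnegativity truncation — not the family of formulas proposed in the paper; the function names `f`, `grad`, and the parameter defaults are assumptions chosen for the example.

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-6, max_iter=200):
    """Nonlinear conjugate gradient with the classic PRP formula.

    Illustrative sketch only: uses a backtracking Armijo inexact line
    search and the PRP+ truncation beta = max(beta_PRP, 0); this is a
    textbook two-term CG method, not the paper's proposed family.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        slope = g @ d
        if slope >= 0:  # safeguard: restart if d is not a descent direction
            d = -g
            slope = g @ d
        # Armijo backtracking line search (inexact line search)
        alpha, rho, c = 1.0, 0.5, 1e-4
        fx = f(x)
        while f(x + alpha * d) > fx + c * alpha * slope:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP: beta_k = g_k^T (g_k - g_{k-1}) / ||g_{k-1}||^2
        beta = g_new @ (g_new - g) / (g @ g)
        beta = max(beta, 0.0)  # PRP+ truncation
        d = -g_new + beta * d  # two-term update: new direction
        x, g = x_new, g_new
    return x
```

On a simple convex quadratic such as `f(x) = x @ x`, the method reduces to a few steepest-descent-like steps and converges rapidly; the nonconvex setting studied in the paper is where the plain PRP formula can fail to converge, motivating the modified family.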
Keywords
Nonconvex functions, Sufficient descent, Trust region, Inexact line search, Global convergence