Distributed Nesterov Gradient and Heavy-Ball Double Accelerated Asynchronous Optimization

IEEE Transactions on Neural Networks and Learning Systems (2021)

Cited by 21
Abstract
In this article, we propose a novel Nesterov gradient and heavy-ball double accelerated distributed synchronous optimization algorithm, called NHDA, and adopt a general asynchronous model to further propose an effective asynchronous algorithm, called ASY-NHDA, for the distributed optimization problem over directed graphs, where each agent has access to a local objective function and computes the ...
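The truncated abstract names two acceleration ingredients, a Nesterov-style look-ahead gradient step and a heavy-ball momentum term, applied to distributed optimization over a directed graph. The sketch below is only a generic illustration of how these two momentum terms can be combined with a consensus (mixing) step; it is not the paper's NHDA or ASY-NHDA update, and the quadratic local objectives, the weight matrix W, and the constants alpha, beta, gamma are illustrative assumptions.

```python
# Generic sketch: consensus mixing + gradient step with heavy-ball momentum and a
# Nesterov-style extrapolation point. Not the paper's exact NHDA/ASY-NHDA updates.
import numpy as np

rng = np.random.default_rng(0)
n_agents, dim = 4, 3

# Local objectives f_i(x) = 0.5 * x^T A_i x - b_i^T x (strongly convex, illustrative).
A = [np.eye(dim) + 0.1 * rng.standard_normal((dim, dim)) for _ in range(n_agents)]
A = [a.T @ a + np.eye(dim) for a in A]
b = [rng.standard_normal(dim) for _ in range(n_agents)]

def grad(i, x):
    return A[i] @ x - b[i]

# Mixing weights for a directed ring 0 -> 1 -> 2 -> 3 -> 0 (assumed topology).
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i - 1) % n_agents] = 0.5

x = np.zeros((n_agents, dim))        # current iterates, one row per agent
x_prev = np.zeros((n_agents, dim))   # previous iterates, used by both momentum terms
alpha, beta, gamma = 0.05, 0.3, 0.3  # step size, heavy-ball weight, Nesterov weight

for _ in range(300):
    mixed = W @ x                                # consensus/mixing step with neighbors
    look_ahead = mixed + gamma * (x - x_prev)    # Nesterov-style extrapolation point
    g = np.array([grad(i, look_ahead[i]) for i in range(n_agents)])
    x_new = mixed - alpha * g + beta * (x - x_prev)  # gradient step + heavy-ball term
    x_prev, x = x, x_new

# With a constant step size, agents approach a neighborhood of the global minimizer;
# exact convergence would require an additional correction such as gradient tracking.
x_star = np.linalg.solve(sum(A), sum(b))
print(np.max(np.linalg.norm(x - x_star, axis=1)))
```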
Keywords
Optimization, Convergence, Delays, Acceleration, Directed graphs, Approximation algorithms, Computational modeling