Provably accelerated decentralized gradient methods over unbalanced directed graphs

SIAM Journal on Optimization (2024)

Abstract
We consider the decentralized optimization problem, where a network of n agents aims to collaboratively minimize the average of their individual smooth and convex objective functions through peer-to-peer communication over a directed graph. To tackle this problem, we propose two accelerated gradient tracking methods, namely Accelerated Push-DIGing (APD) and APD-SC, for non-strongly convex and strongly convex objective functions, respectively. We show that APD and APD-SC converge at the rates $O(1/k^2)$ and $O((1 - C\sqrt{\mu/L})^k)$, respectively, up to constant factors depending only on the mixing matrix. APD and APD-SC are the first decentralized methods over unbalanced directed graphs that achieve the same provable acceleration as centralized methods. Numerical experiments demonstrate the effectiveness of both methods.
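The abstract does not spell out the APD updates, but the methods build on gradient tracking with push-sum corrections for directed graphs. Below is a minimal sketch of the (non-accelerated) Push-DIGing baseline that such methods extend, assuming a column-stochastic mixing matrix for the directed graph; the function name `push_diging`, the toy quadratic objectives, and the step size are illustrative choices, not the paper's APD algorithm.

```python
import numpy as np

def push_diging(grads, C, x0, alpha, iters):
    """Non-accelerated Push-DIGing sketch over a directed graph.

    grads : list of per-agent gradient functions grad_i(x) -> ndarray
    C     : column-stochastic mixing matrix of the directed graph
    x0    : (n, d) initial iterates, one row per agent
    """
    n, d = x0.shape
    u = x0.copy()                 # push-sum numerators
    v = np.ones(n)                # push-sum weights (correct graph imbalance)
    z = x0.copy()                 # de-biased iterates
    g = np.stack([grads[i](z[i]) for i in range(n)])
    y = g.copy()                  # gradient tracker, initialized at local gradients
    for _ in range(iters):
        u = C @ (u - alpha * y)   # local gradient step, then mix along out-edges
        v = C @ v                 # mix the push-sum weights identically
        z = u / v[:, None]        # de-bias: recover an estimate of the average
        g_new = np.stack([grads[i](z[i]) for i in range(n)])
        y = C @ y + g_new - g     # track the network-average gradient
        g = g_new
    return z

# Toy problem: f_i(x) = 0.5 * ||x - a_i||^2, so the minimizer is mean(a_i).
n, d = 5, 1
rng = np.random.default_rng(0)
a = rng.normal(size=(n, d))
grads = [lambda x, ai=a[i]: x - ai for i in range(n)]

# Directed ring: each agent keeps half its mass and pushes half to its successor,
# giving a column-stochastic (not doubly stochastic) mixing matrix.
C = 0.5 * (np.eye(n) + np.roll(np.eye(n), 1, axis=0))

z = push_diging(grads, C, np.zeros((n, d)), alpha=0.1, iters=500)
```

The push-sum weights `v` are what make this work on unbalanced directed graphs: `C` only needs column sums equal to one, and dividing by `v` removes the bias that column-stochastic mixing would otherwise introduce.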
Keywords
methods involving communication compression, asynchronous updates, composite op