S-ADDOPT: Decentralized Stochastic First-Order Optimization Over Directed Graphs
IEEE Control Systems Letters (2021)
Abstract
In this letter, we study decentralized stochastic optimization to minimize a sum of smooth and strongly convex cost functions that are distributed over a directed network of nodes. In contrast to existing work, we use gradient tracking to improve certain aspects of the resulting algorithm. In particular, we propose the S-ADDOPT algorithm, which assumes a stochastic first-order oracle at each node, and show that for a constant step-size α, each node converges linearly inside an error ball around the optimal solution, the size of which is controlled by α. For decaying step-sizes O(1/k), we show that S-ADDOPT converges to the exact solution at a sublinear O(1/k) rate and that its convergence is asymptotically network independent. Thus, the asymptotic behavior of S-ADDOPT is comparable to that of centralized stochastic gradient descent. Numerical experiments on both strongly convex and nonconvex problems illustrate the convergence behavior and the performance of the proposed algorithm.
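The abstract describes gradient tracking with a stochastic first-order oracle over a directed graph. A minimal numerical sketch of this style of update is given below, in the push-sum (ADD-OPT-like) form with stochastic gradients substituted for exact ones. The weight matrix, the quadratic local costs, the noise model, and all parameter values here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hedged sketch: push-sum gradient tracking with a stochastic first-order
# oracle, in the spirit of S-ADDOPT. All problem data below are made up.

rng = np.random.default_rng(0)
n, d = 4, 3                      # number of nodes, problem dimension

# Column-stochastic weights for a directed ring with self-loops
# (each column sums to 1, as required by push-sum).
A = np.zeros((n, n))
for j in range(n):
    A[j, j] = 0.5                # self-loop
    A[(j + 1) % n, j] = 0.5      # directed edge j -> j+1

# Local strongly convex quadratics f_i(x) = 0.5 * ||x - b_i||^2;
# the minimizer of the average cost is the mean of the b_i.
B = rng.normal(size=(n, d))
x_star = B.mean(axis=0)

def stoch_grad(i, z):
    """Stochastic first-order oracle: exact gradient plus zero-mean noise."""
    return (z - B[i]) + 0.01 * rng.normal(size=d)

alpha = 0.05                     # constant step-size -> error ball around x_star
x = np.zeros((n, d))             # primal iterates (one row per node)
y = np.ones(n)                   # push-sum scaling variables
z = x / y[:, None]               # de-biased iterates
g = np.array([stoch_grad(i, z[i]) for i in range(n)])
w = g.copy()                     # gradient tracker, initialized at the gradients

for _ in range(500):
    x = A @ x - alpha * w        # mix and descend along the tracked gradient
    y = A @ y                    # mix the push-sum scalars
    z = x / y[:, None]           # remove the directed-graph imbalance
    g_new = np.array([stoch_grad(i, z[i]) for i in range(n)])
    w = A @ w + g_new - g        # gradient-tracking recursion
    g = g_new

# With a constant step-size, iterates settle in a small ball around x_star.
err = float(np.linalg.norm(z - x_star, axis=1).max())
print(round(err, 3))
```

With the constant step-size above, the residual error does not vanish but stays bounded by a ball whose radius shrinks with α, matching the behavior the abstract attributes to the constant step-size regime; replacing `alpha` with a decaying O(1/k) schedule would target exact convergence instead.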
Keywords
Stochastic optimization, first-order methods, decentralized algorithms, directed graphs, neural networks