String-averaging incremental stochastic subgradient algorithms

R. M. Oliveira, Elias S. Helou, E. F. Costa

OPTIMIZATION METHODS & SOFTWARE (2019)

Abstract
We present a method for solving constrained convex stochastic optimization problems in which the objective is a finite sum of convex functions. Our method is based on Incremental Stochastic Subgradient Algorithms and String-Averaging techniques, under the assumption that the subgradient directions are affected by random errors at each iteration. Our analysis allows the method to perform approximate projections onto the feasible set at each iteration. We provide convergence results for the case where a diminishing step-size rule is used. We test our method on a large set of random instances of a stochastic convex programming problem and compare its performance with the robust mirror descent stochastic approximation algorithm proposed in Nemirovski et al. (Robust stochastic approximation approach to stochastic programming, SIAM J. Optim. 19 (2009), pp. 1574-1609).
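
To make the abstract's description concrete, the following is a minimal Python sketch of a string-averaging incremental stochastic subgradient iteration. The toy problem, the partition into strings, the noise level, the step-size constant, and the use of an exact Euclidean projection are all assumptions added for illustration; they are not taken from the paper, which in particular allows approximate projections and general convex feasible sets.

# Illustrative sketch only (hypothetical problem and parameters), not the
# authors' exact pseudocode.
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: minimize sum_i |A_i x - b_i| over the Euclidean ball ||x|| <= r.
m, n, r = 20, 5, 10.0
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

def subgrad(i, x, noise=0.01):
    # Subgradient of f_i(x) = |A_i x - b_i|, perturbed by a random error.
    g = np.sign(A[i] @ x - b[i]) * A[i]
    return g + noise * rng.standard_normal(n)

def project(x):
    # Exact projection onto the ball; the paper also admits approximate projections.
    nx = np.linalg.norm(x)
    return x if nx <= r else (r / nx) * x

# Partition the component indices into strings processed sequentially.
strings = np.array_split(rng.permutation(m), 4)

x = np.zeros(n)
for k in range(1, 501):
    alpha = 1.0 / k                          # diminishing step size
    endpoints = []
    for s in strings:                        # incremental pass along each string
        y = x.copy()
        for i in s:
            y = project(y - alpha * subgrad(i, y))
        endpoints.append(y)
    x = project(np.mean(endpoints, axis=0))  # string averaging (equal weights)

print("objective:", np.abs(A @ x - b).sum())

In this sketch each string performs an incremental pass over its components using noisy subgradients, the string endpoints are averaged with equal weights, and the average is projected back onto the feasible set; alpha_k = 1/k is one common diminishing step-size rule.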
Keywords
Convex optimization, stochastic optimization, incremental algorithms, stochastic subgradient methods, approximate projection methods, string-averaging algorithms