Accelerated Variance Reduction Stochastic ADMM for Large-Scale Machine Learning

IEEE Transactions on Pattern Analysis and Machine Intelligence (2021)

Abstract
Recently, many stochastic variance-reduced alternating direction methods of multipliers (ADMMs), e.g., SAG-ADMM and SVRG-ADMM, have made exciting progress, such as achieving a linear convergence rate for strongly convex (SC) problems. However, their best-known convergence rate for non-strongly convex (non-SC) problems is $\mathcal{O}(1/T)$ […]
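To make the variance-reduction idea behind methods like SVRG-ADMM concrete, here is a minimal sketch on a lasso instance, min_w (1/2n)||Xw − y||² + λ||z||₁ s.t. w − z = 0. It combines an SVRG-style gradient estimator with linearized ADMM updates; the function name, step sizes, and problem choice are illustrative assumptions, and the paper's own accelerated variant (with momentum) is not reproduced here.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding, the proximal operator of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def svrg_admm_lasso(X, y, lam=0.1, rho=1.0, eta=0.1,
                    epochs=30, inner=None, seed=0):
    """
    Hypothetical SVRG-style stochastic ADMM sketch for the lasso:
        min_w (1/2n)||Xw - y||^2 + lam*||z||_1  s.t.  w - z = 0.
    The w-update is linearized around the current iterate and uses the
    variance-reduced gradient v = g_i(w) - g_i(w_snap) + full_grad(w_snap).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    inner = inner or n
    w = np.zeros(d); z = np.zeros(d); u = np.zeros(d)
    for _ in range(epochs):
        w_snap = w.copy()
        # Full gradient at the snapshot: the anchor that shrinks the
        # variance of the stochastic estimate as w approaches w_snap.
        mu = X.T @ (X @ w_snap - y) / n
        for _ in range(inner):
            i = rng.integers(n)
            xi, yi = X[i], y[i]
            # Per-sample gradients at the current iterate and the snapshot.
            gi = xi * (xi @ w - yi)
            gi_snap = xi * (xi @ w_snap - yi)
            v = gi - gi_snap + mu  # unbiased, variance-reduced estimate
            # Linearized proximal w-update of ADMM, solved in closed form:
            # argmin_w v.w + (rho/2)||w - z + u||^2 + (1/2eta)||w - w_k||^2
            w = (rho * (z - u) + w / eta - v) / (rho + 1.0 / eta)
            # Standard ADMM z-update (prox of the l1 term) and dual ascent.
            z = soft_threshold(w + u, lam / rho)
            u = u + w - z
    return z
```

A quick usage example under these assumptions: `w_hat = svrg_admm_lasso(np.random.randn(200, 50), np.random.randn(200))`. The snapshot gradient `mu` is recomputed once per epoch, so each inner step costs one sample gradient plus O(d) work, which is what makes such methods attractive at large scale.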
Keywords
Convex functions, Acceleration, Convergence, Stochastic processes, Optimization, Machine learning, Complexity theory