A New Stochastic Limited Memory BFGS Algorithm

JOURNAL OF MATHEMATICAL EXTENSION (2020)

Abstract
In this paper, a new limited memory BFGS method is proposed for solving stochastic optimization problems. Since the cost of storing and manipulating the inverse Hessian approximation H_k is prohibitive in the large-scale setting, L-BFGS algorithms keep only the most recent correction pairs. Moreover, in the stochastic regime, noise in both the gradient vector and the Hessian approximation means the second-order model is not an accurate estimate of the objective function. To overcome this problem, the proposed L-BFGS uses its memory in an optimal manner by storing the correction pairs that least violate the secant equation. Under standard assumptions, convergence of the new algorithm is established for strongly convex functions. Numerical results on problems arising in machine learning show that the new method is competitive and effective in practice.
Keywords
Limited memory BFGS (L-BFGS), stochastic optimization, secant equation
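
As a rough illustration of the idea described in the abstract, the sketch below combines a standard L-BFGS two-loop recursion with a memory update that discards the stored correction pair having the largest secant-equation violation, rather than simply the oldest pair. The violation score (the relative residual of H y = s under the current limited-memory approximation), the curvature safeguard, and all function and variable names are illustrative assumptions, not the paper's actual algorithm.

import numpy as np

def two_loop_recursion(grad, pairs, gamma):
    # Standard L-BFGS two-loop recursion: returns H_k @ grad, where H_k is
    # defined implicitly by the stored pairs (s_i, y_i) and the scaling gamma*I.
    q = grad.copy()
    alphas = []
    for s, y in reversed(pairs):                          # newest pair first
        rho = 1.0 / (y @ s)
        alpha = rho * (s @ q)
        alphas.append(alpha)
        q = q - alpha * y
    r = gamma * q
    for (s, y), alpha in zip(pairs, reversed(alphas)):    # oldest pair first
        rho = 1.0 / (y @ s)
        beta = rho * (y @ r)
        r = r + (alpha - beta) * s
    return r

def secant_violation(pairs, gamma, s, y):
    # Illustrative score for one candidate pair: relative residual of the
    # secant equation H y = s under the current limited-memory approximation.
    Hy = two_loop_recursion(y, pairs, gamma)
    return np.linalg.norm(Hy - s) / max(np.linalg.norm(s), 1e-12)

def update_memory(pairs, s, y, m, gamma):
    # Keep at most m correction pairs; when the memory is full, drop the pair
    # with the LARGEST secant-equation violation instead of the oldest one.
    if y @ s <= 1e-10 * np.linalg.norm(s) * np.linalg.norm(y):
        return pairs                      # skip pairs without positive curvature
    pairs = pairs + [(s, y)]
    if len(pairs) > m:
        scores = [secant_violation([p for p in pairs if p is not q], gamma, q[0], q[1])
                  for q in pairs]
        pairs.pop(int(np.argmax(scores)))
    return pairs

A single stochastic iteration would then compute a mini-batch gradient g, step along -two_loop_recursion(g, pairs, gamma) with, for example, gamma = (s @ y) / (y @ y) taken from the latest pair, and call update_memory with s equal to the iterate difference and y the corresponding (mini-batch) gradient difference.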