On the powerball method

2017 29th Chinese Control and Decision Conference (CCDC), 2017

Abstract
We propose a new method to accelerate the convergence of optimization algorithms. The method simply adds a power coefficient γ ∈ [0, 1) to the gradient during optimization. We call this the Powerball method, analyze its convergence rate for strongly convex functions, and show that it converges faster than gradient descent and Newton's method in the initial iterations. We also demonstrate that the Powerball method provides a 10-fold speedup of the convergence of both gradient descent and L-BFGS on multiple real datasets.
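A minimal sketch of what the abstract describes, assuming the standard Powerball transform of applying `sign(g) * |g|**gamma` elementwise to the gradient before the usual descent step; the learning rate, γ value, and the toy quadratic objective below are illustrative choices, not values from the paper.

```python
import numpy as np

def powerball_step(x, grad, lr=0.1, gamma=0.5):
    """One Powerball update: transform the gradient elementwise with
    sign(g) * |g|**gamma, then take an ordinary gradient-descent step.
    With gamma = 1 this reduces to plain gradient descent."""
    return x - lr * np.sign(grad) * np.abs(grad) ** gamma

# Toy example: minimize f(x) = 0.5 * ||x||^2, whose gradient is x.
x = np.array([4.0, -2.0])
for _ in range(100):
    x = powerball_step(x, grad=x)
```

For γ < 1 the transform magnifies small gradient components and damps large ones, which is consistent with the abstract's claim of faster progress in the initial iterations when gradients are large.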
Keywords
Optimization Method,Convex Optimization