Approximately Exact Line Search

arXiv (2020)

Abstract
We propose approximately exact line search, which uses only function evaluations to select a step size within a constant fraction of the exact line search minimizer. We bound the number of iterations and function evaluations, showing linear convergence on smooth, strongly convex objectives with no dependence on the initial step size for three variations of gradient descent: using the true gradient, using an approximate gradient, and using a random search direction. We demonstrate experimental speedup on logistic regression for both gradient descent and minibatch stochastic gradient descent and on a benchmark set of derivative-free optimization objectives using quasi-Newton search directions.
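The abstract gives no pseudocode, so the following is only a minimal sketch of one plausible realization: a search that repeatedly shrinks or grows a trial step by a constant factor using only function evaluations, stopping when the one-dimensional objective stops improving. The function name, the factor `beta`, the initial step `t0`, and the iteration cap are all illustrative assumptions, not details from the paper.

```python
import numpy as np

def approx_exact_line_search(f, x, d, t0=1.0, beta=2.0 / 3.0, max_iter=50):
    """Select a step size along direction d using only evaluations of f.

    Shrinks or grows a trial step by the constant factor `beta`
    (an assumed, illustrative choice) until the one-dimensional
    objective phi(t) = f(x + t*d) stops improving; for a unimodal
    phi this leaves t within a constant factor of the exact
    line-search minimizer.
    """
    phi = lambda t: f(x + t * d)
    t, ft = t0, phi(t0)
    t_next, f_next = beta * t, phi(beta * t)
    if f_next <= ft:
        # Minimizer lies at or below t0: shrink by beta while it helps.
        for _ in range(max_iter):
            t, ft = t_next, f_next
            t_next, f_next = beta * t, phi(beta * t)
            if f_next > ft:
                break
    else:
        # Minimizer lies above t0: grow by 1/beta while it helps.
        for _ in range(max_iter):
            t_next, f_next = t / beta, phi(t / beta)
            if f_next >= ft:
                break
            t, ft = t_next, f_next
    return t

# Hypothetical usage: gradient descent on a simple strongly convex quadratic.
A = np.diag([1.0, 10.0])
f = lambda z: 0.5 * z @ A @ z
x = np.array([5.0, 5.0])
for _ in range(20):
    d = -(A @ x)  # true gradient descent direction
    x = x + approx_exact_line_search(f, x, d) * d
print(f"f(x) after 20 steps: {f(x):.2e}")
```

Because each step size is chosen afresh by the search, the toy loop above needs no tuned learning rate, consistent with the abstract's claim of no dependence on the initial step size.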
Keywords
exact line search