
Direct Search Based on Probabilistic Descent

SIAM Journal on Optimization (2015)

Abstract
Direct-search methods are a class of popular derivative-free algorithms characterized by evaluating the objective function using a step size and a number of (polling) directions. When applied to the minimization of smooth functions, the polling directions are typically taken from positive spanning sets, which in turn must have at least n + 1 vectors in an n-dimensional variable space. In addition, to ensure the global convergence of these algorithms, the positive spanning sets used throughout the iterations are required to be uniformly nondegenerate in the sense of having a positive (cosine) measure bounded away from zero. However, recent numerical results indicated that randomly generating the polling directions without imposing the positive spanning property can improve the performance of these methods, especially when the number of directions is chosen considerably less than n + 1. In this paper, we analyze direct-search algorithms when the polling directions are probabilistic descent, meaning that with a certain probability at least one of them is of descent type. Such a framework enjoys almost-sure global convergence. More interestingly, we will show a global decaying rate of 1/√k for the gradient size, with overwhelmingly high probability, matching the corresponding rate for the deterministic versions of the gradient method or of direct search. Our analysis helps us understand numerical behavior and the choice of the number of polling directions.
Keywords
derivative-free optimization, direct-search methods, polling, positive spanning sets, probabilistic descent, random directions