Revisiting Theoretical Guarantees of Direct-Search Methods

arXiv (2024)

Abstract
Optimizing a function without access to its derivatives is a challenging task that precludes the use of classical algorithms from nonlinear optimization, and may thus seem intractable other than by heuristics. However, the field of derivative-free optimization has succeeded in producing algorithms that do not rely on derivatives and yet are endowed with convergence guarantees. One class of such methods, called direct search, is particularly popular thanks to its simplicity of implementation, even though its theoretical underpinnings are not always easy to grasp. In this work, we survey contemporary direct-search algorithms from a theoretical viewpoint, with the aim of highlighting their key theoretical features. Our study goes beyond the classical, textbook cases and tackles the presence of nonsmoothness, noise, and constraints in the problem at hand. In addition to reviewing classical results in the field, we provide new perspectives on existing results, as well as novel proofs that illustrate the versatility of direct-search schemes.
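To make the notion of a direct-search method concrete, the following is a minimal sketch of a classical directional direct search with coordinate polling and a sufficient-decrease condition. It is an illustrative textbook variant, not an algorithm taken from the paper; the function names, step-size update factors, and forcing function `c * alpha**2` are choices made here for the sketch.

```python
import numpy as np

def direct_search(f, x0, alpha0=1.0, c=1e-4, max_iter=500, alpha_tol=1e-8):
    """Sketch of a basic directional direct-search method.

    Polls the positive and negative coordinate directions (a positive
    spanning set) and accepts a trial point only if it achieves
    sufficient decrease, measured by the forcing function c * alpha^2.
    No derivatives of f are ever evaluated.
    """
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    alpha = alpha0
    n = x.size
    # Polling directions: +/- unit coordinate vectors.
    D = np.vstack([np.eye(n), -np.eye(n)])
    for _ in range(max_iter):
        if alpha < alpha_tol:  # small step size signals approximate stationarity
            break
        improved = False
        for d in D:
            trial = x + alpha * d
            ft = f(trial)
            if ft < fx - c * alpha**2:  # sufficient decrease test
                x, fx = trial, ft
                improved = True
                break  # opportunistic polling: accept the first success
        # Expand the step after a success, shrink it after a failed poll.
        alpha = 2.0 * alpha if improved else 0.5 * alpha
    return x, fx
```

On a smooth function, shrinking the step size only after an unsuccessful poll over a positive spanning set is what drives the convergence guarantees surveyed in the paper: an unsuccessful iteration with a small step bounds the gradient norm at the current iterate.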