Sequential Monte Carlo optimization and statistical inference

Wiley Interdisciplinary Reviews: Computational Statistics (2022)

Abstract
Sequential Monte Carlo (SMC) is a powerful technique originally developed for particle filtering and Bayesian inference. Its role as a generic optimizer for statistical and nonstatistical objectives is far less well known. Density-tempered SMC is a highly efficient sampling technique ideally suited to challenging global optimization problems, and it can be implemented with a largely arbitrary initialization sampler rather than relying on a prior distribution. SMC optimization is anchored in the fact that any optimization task (with a continuous, discontinuous, combinatorial, or noisy objective function) can be turned into sampling under a density or probability function known up to a normalizing constant. The point with the highest functional value is the SMC estimate of the maximum. Through examples, we systematically present various density-tempered SMC algorithms and their superior performance versus other techniques such as Markov chain Monte Carlo. Data cloning and k-fold duplication are two easily implementable accuracy accelerators, and their complementarity is discussed. The Extreme Value Theorem on the maximum order statistic can also help assess the quality of the SMC optimum. Our coverage includes the algorithmic essence of density-tempered SMC with various enhancements, together with solutions for (1) a bimodal nonstatistical function without and with constraints, (2) a multidimensional step function, (3) offline and online optimizations, (4) combinatorial variable selection, and (5) noninvertibility of the Hessian.

This article is categorized under:
Statistical and Graphical Methods of Data Analysis > Monte Carlo Methods
Algorithms and Computational Methods > Stochastic Optimization
Algorithms and Computational Methods > Integer Programming
Keywords
combinatorial optimization, data cloning, density tempering, extreme value
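
The core idea stated in the abstract, turning maximization into sampling from a density known only up to a normalizing constant, can be illustrated with a minimal sketch: the objective is exponentiated into an unnormalized density, that density is gradually tempered so it concentrates around the maximizer, and particles are reweighted, resampled, and moved at each tempering step. The Python/NumPy sketch below is illustrative only; the bimodal test function, the linear tempering schedule, and the random-walk move step are assumptions made for demonstration, not the authors' exact algorithm or settings.

# A minimal sketch of density-tempered SMC optimization (illustrative assumptions throughout).
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Hypothetical bimodal objective to be maximized (not from the paper).
    return np.exp(-0.5 * (x - 2.0) ** 2) + 1.3 * np.exp(-0.5 * (x + 2.0) ** 2)

N = 2000                              # number of particles
x = rng.uniform(-10.0, 10.0, N)       # arbitrary initialization sampler; no prior needed
gammas = np.linspace(0.0, 50.0, 51)   # assumed tempering schedule for exp(gamma * f(x))

for g_prev, g in zip(gammas[:-1], gammas[1:]):
    # Reweight particles toward the sharper tempered density exp(g * f(x)).
    logw = (g - g_prev) * f(x)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # Resample to equalize weights.
    x = x[rng.choice(N, size=N, p=w)]
    # Move particles with a random-walk Metropolis step targeting exp(g * f(x)).
    prop = x + 0.5 * rng.standard_normal(N)
    accept = np.log(rng.uniform(size=N)) < g * (f(prop) - f(x))
    x = np.where(accept, prop, x)

# The particle with the highest objective value is the SMC estimate of the maximizer.
x_hat = x[np.argmax(f(x))]
print(x_hat, f(x_hat))

As the tempering parameter grows, the sampled density piles its mass on the region of the global maximum, so the best particle at the final stage serves as the SMC estimate of the maximum, mirroring the statement in the abstract.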