Improved Regression Models for Algorithm Configuration

Proceedings of the 2022 Genetic and Evolutionary Computation Conference (GECCO '22), 2022

Abstract
Offline algorithm configuration methods search for fixed parameter values for a given set of problem instances. For each parameter, such methods perform the equivalent of a constant regression, since the parameter value remains constant for any problem instance. However, optimal parameter values may depend on instance features, such as the instance size. In this paper, we represent parameters by non-constant models, which set the parameter values according to the instance size. Instead of searching for parameter values directly, the configuration process calibrates such models. In particular, we propose a simple yet effective linear model, which approximates linear relations between instance size and optimal parameter values. For modeling nonlinear relations, we propose piecewise and log-log linear models. The evaluation of the proposed methods on four configuration scenarios shows good performance gains compared to traditional instance-independent algorithm configuration with comparable tuning effort.
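To illustrate the model families named in the abstract, the following minimal sketch shows how a parameter value could be computed from the instance size under a linear, a piecewise linear, and a log-log linear model. It is not the authors' implementation; the coefficient names (a, b, breakpoint) and example values are assumptions for illustration only. The point of the approach is that an offline configurator would tune these coefficients rather than a single fixed parameter value.

```python
# Illustrative sketch (not the paper's code): size-dependent parameter models.
import math

def linear_model(s, a, b):
    # Linear relation between instance size s and the parameter value.
    return a + b * s

def piecewise_linear_model(s, a1, b1, a2, b2, breakpoint):
    # Two linear segments switching at an assumed breakpoint in instance size.
    if s <= breakpoint:
        return a1 + b1 * s
    return a2 + b2 * s

def loglog_linear_model(s, a, b):
    # Linear in log-log space: log(p) = a + b*log(s), i.e. p = exp(a) * s**b.
    return math.exp(a) * s ** b

# Example: evaluate each model for a hypothetical instance of size 1000.
s = 1000
print(linear_model(s, a=10.0, b=0.05))
print(piecewise_linear_model(s, a1=10.0, b1=0.05, a2=40.0, b2=0.02, breakpoint=500))
print(loglog_linear_model(s, a=1.0, b=0.5))
```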
Keywords
Instance-specific algorithm configuration, per-instance parameter tuning, parameter regression models