Multifidelity Bayesian Optimization for Binomial Output.

arXiv: Machine Learning (2019)

Cited 23 | Views 10
Abstract
The key idea of Bayesian optimization is to replace an expensive target function with a cheap surrogate model. The choice of acquisition function governs the trade-off between exploration and exploitation; the acquisition function typically depends on the mean and the variance of the surrogate model at a given point. The most common Gaussian-process-based surrogate model assumes that the target with fixed parameters is a realization of a Gaussian process. However, the target function often does not satisfy this assumption. Here we consider target functions that come from a binomial distribution whose parameter depends on the inputs. Typically we can vary how many Bernoulli samples we obtain during each evaluation. We propose a general Gaussian process model that takes Bernoulli outputs into account. To make this work, we consider a simple acquisition function based on Expected Improvement and a heuristic strategy for choosing the number of samples at each point, thus taking the precision of the obtained output into account.
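The abstract describes the ingredients without code. Below is a minimal sketch of how such a loop could look, assuming a Gaussian approximation to the binomial likelihood: each evaluation returns k successes out of n Bernoulli trials, the empirical rate k/n is fed to a GP with heteroscedastic noise p(1-p)/n on the diagonal, Expected Improvement picks the next point, and a simple rule picks the next sample size. The function names, the fixed kernel hyperparameters, the toy target true_p, and the sample-size rule are all illustrative assumptions, not the model or heuristic proposed in the paper.

```python
# A minimal, self-contained sketch in NumPy/SciPy, NOT the authors' code.
# Assumptions (not from the paper): the Bernoulli observations are summarized
# by the empirical success rate p_hat = k / n and modelled with a Gaussian
# approximation, so each point gets heteroscedastic observation noise
# p_hat * (1 - p_hat) / n on the GP diagonal; kernel hyperparameters are
# fixed; the sample-size rule at the end is one illustrative heuristic.
import numpy as np
from scipy.stats import norm


def rbf_kernel(A, B, lengthscale=0.2, variance=1.0):
    """Squared-exponential kernel between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)


def gp_posterior(X, y, noise_var, X_star):
    """GP posterior mean and std at X_star with per-point observation noise."""
    K = rbf_kernel(X, X) + np.diag(noise_var) + 1e-9 * np.eye(len(X))
    K_s = rbf_kernel(X, X_star)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.clip(np.diag(rbf_kernel(X_star, X_star)) - (v ** 2).sum(0), 1e-12, None)
    return mu, np.sqrt(var)


def expected_improvement(mu, sigma, best, xi=0.01):
    """Standard EI for maximizing the success probability."""
    z = (mu - best - xi) / sigma
    return (mu - best - xi) * norm.cdf(z) + sigma * norm.pdf(z)


# Toy target: an unknown success probability over x in [0, 1] (illustrative).
def true_p(x):
    return 0.2 + 0.6 * np.exp(-((x - 0.7) ** 2) / 0.02)


rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(5, 1))            # initial design points
n_trials = np.full(len(X), 20)                    # Bernoulli samples per point
k = rng.binomial(n_trials, true_p(X[:, 0]))       # observed successes
p_hat = k / n_trials
noise_var = np.maximum(p_hat * (1.0 - p_hat), 1e-3) / n_trials

# Fit a zero-mean GP to the centred empirical rates and score a candidate grid.
X_grid = np.linspace(0.0, 1.0, 201)[:, None]
mu, sigma = gp_posterior(X, p_hat - p_hat.mean(), noise_var, X_grid)
mu += p_hat.mean()
ei = expected_improvement(mu, sigma, best=p_hat.max())

# Next evaluation point: maximize EI over the grid.
i_next = int(np.argmax(ei))
x_next = float(X_grid[i_next, 0])

# Illustrative sample-size heuristic (an assumption, not the paper's rule):
# take enough Bernoulli trials so that the standard error of p_hat at x_next
# is no larger than the current GP posterior std there.
p_next = float(np.clip(mu[i_next], 0.01, 0.99))
n_next = int(np.clip(np.ceil(p_next * (1.0 - p_next) / sigma[i_next] ** 2), 10, 200))

print(f"next point x = {x_next:.3f}, Bernoulli samples n = {n_next}")
```

In the paper's setting the likelihood is genuinely binomial, so a non-Gaussian treatment of the observations (e.g. a latent-GP classification-style model) may be preferable; the sketch only illustrates how the number of Bernoulli trials enters the surrogate's uncertainty and the choice of the next evaluation.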