A Unifying Framework for Gaussian Process Pseudo-Point Approximations using Power Expectation Propagation.

Journal of Machine Learning Research (2017)

Abstract
Gaussian processes (GPs) are flexible distributions over functions that enable high-level assumptions about unknown functions to be encoded in a parsimonious, flexible and general way. Although elegant, the application of GPs is limited by computational and analytical intractabilities that arise when data are sufficiently numerous or when employing non-Gaussian models. Consequently, a wealth of GP approximation schemes have been developed over the last 15 years to address these key limitations. Many of these schemes employ a small set of pseudo data points to summarise the actual data. In this paper we develop a new pseudo-point approximation framework using Power Expectation Propagation (Power EP) that unifies a large number of these pseudo-point approximations. Unlike much of the previous venerable work in this area, the new framework is built on standard methods for approximate inference (variational free-energy, EP and Power EP methods) rather than employing approximations to the probabilistic generative model itself. In this way all of the approximation is performed at 'inference time' rather than at 'modelling time', resolving awkward philosophical and empirical questions that trouble previous approaches. Crucially, we demonstrate that the new framework includes new pseudo-point approximation methods that outperform current approaches on regression and classification tasks.
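To make the abstract's central object concrete, the sketch below illustrates one pseudo-point approximation that the framework recovers as a special case: the variational free-energy (Titsias-style) posterior for GP regression, in which an M x M system over pseudo-points replaces the N x N one over data points. It is a minimal NumPy illustration, not the paper's Power EP framework; the kernel, synthetic data, and helper names (rbf, vfe_predict) are assumptions made for this example.

```python
# Minimal sketch of a pseudo-point (inducing-point) GP regression approximation
# using the variational free-energy posterior. Illustrative only.
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between row vectors of A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def vfe_predict(X, y, Z, Xs, noise=0.1):
    """Predictive mean/variance at test inputs Xs using M pseudo-points Z."""
    Kuu = rbf(Z, Z) + 1e-6 * np.eye(len(Z))  # jitter for numerical stability
    Kuf = rbf(Z, X)
    Ksu = rbf(Xs, Z)
    A = noise**2 * Kuu + Kuf @ Kuf.T          # M x M system instead of N x N
    mean = Ksu @ np.linalg.solve(A, Kuf @ y)  # posterior predictive mean
    # Predictive variance: prior minus Nystrom term plus pseudo-point uncertainty
    Luu = np.linalg.cholesky(Kuu)
    V = np.linalg.solve(Luu, Ksu.T)
    Qss = V.T @ V                             # Ksu Kuu^{-1} Kus
    Sigma = noise**2 * Kuu @ np.linalg.solve(A, Kuu)
    W = np.linalg.solve(Kuu, Ksu.T)
    var = np.diag(rbf(Xs, Xs) - Qss + W.T @ Sigma @ W) + noise**2
    return mean, var

# Illustrative usage with synthetic data and M = 10 pseudo-points.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
Z = np.linspace(-3, 3, 10)[:, None]
Xs = np.linspace(-3, 3, 50)[:, None]
mu, var = vfe_predict(X, y, Z, Xs)
```

The paper's contribution is to derive this and related pseudo-point posteriors (e.g. EP/FITC-style ones) from a single Power EP procedure applied at inference time, rather than from modified generative models.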
Keywords
Gaussian process, expectation propagation, variational inference, sparse approximation