Exact O(N²) Hyper-Parameter Optimization for Gaussian Process Regression

2020 IEEE 30th International Workshop on Machine Learning for Signal Processing (MLSP)

Abstract
Hyper-parameter optimization remains the core issue of Gaussian process (GP) regression for machine learning. The benchmark method, maximum likelihood (ML) estimation with gradient descent (GD), is impractical for processing big data due to its O(n³) complexity. Many sophisticated global or local approximation models have been proposed to address this complexity issue. In this paper, we propose two novel and exact GP hyper-parameter training schemes by replacing ML with cross-validation (CV) as the fitting criterion and replacing GD with a non-linearly constrained alternating direction method of multipliers (ADMM) as the optimization method. The proposed schemes are of O(n²) complexity for any covariance matrix without special structure. We conduct experiments on synthetic and real datasets, in which the proposed schemes show excellent performance in terms of convergence, hyper-parameter estimation, and computational time in comparison with traditional ML-based routines.
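
The abstract does not spell out the ADMM formulation, so the following is only a rough illustration of the CV fitting criterion that replaces ML: a minimal sketch of leave-one-out CV (LOO-CV) for GP hyper-parameters using the standard closed-form LOO expressions. The squared-exponential kernel, the scipy optimizer, and all function names here are assumptions for demonstration; the paper's actual schemes reach O(n²) cost via the constrained ADMM, which is not reproduced here.

```python
# Illustrative sketch only: a generic LOO-CV fitting criterion for GP
# hyper-parameters, NOT the paper's O(n^2) ADMM scheme. Kernel choice,
# parameterization, and the L-BFGS-B optimizer are assumptions.
import numpy as np
from scipy.optimize import minimize

def sq_exp_kernel(X1, X2, lengthscale, signal_var):
    """Squared-exponential covariance between row-wise inputs X1 and X2."""
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return signal_var * np.exp(-0.5 * d2 / lengthscale**2)

def neg_loo_cv(log_theta, X, y):
    """Negative LOO-CV log predictive density (closed form via K^{-1}).

    This direct evaluation still costs O(n^3) per step, which is exactly
    the cost the paper's schemes avoid; the criterion itself is the same
    CV idea used in place of the marginal likelihood.
    """
    lengthscale, signal_var, noise_var = np.exp(log_theta)
    n = len(y)
    K = sq_exp_kernel(X, X, lengthscale, signal_var) + noise_var * np.eye(n)
    Kinv = np.linalg.inv(K)
    alpha = Kinv @ y
    diag = np.diag(Kinv)
    mu_loo = y - alpha / diag          # leave-one-out predictive means
    var_loo = 1.0 / diag               # leave-one-out predictive variances
    logp = (-0.5 * np.log(var_loo)
            - 0.5 * (y - mu_loo) ** 2 / var_loo
            - 0.5 * np.log(2 * np.pi))
    return -np.sum(logp)

# Toy usage on synthetic 1-D data.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(100)
res = minimize(neg_loo_cv, x0=np.zeros(3), args=(X, y), method="L-BFGS-B")
print("estimated (lengthscale, signal_var, noise_var):", np.exp(res.x))
```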
Keywords
Gaussian process, hyper-parameter optimization, ADMM, cross-validation, low complexity