Probabilistic Inference of Bayesian Neural Networks with Generalized Expectation Propagation

Neurocomputing (2020)

Abstract
Deep learning plays an important role in the field of machine learning. However, deterministic models such as standard neural networks cannot capture model uncertainty. Bayesian neural networks (BNNs) have recently attracted attention because Bayesian models provide a theoretical framework for inferring model uncertainty. Since an analytical solution for a BNN posterior is generally intractable, an effective and efficient approximate inference method is essential for model training and prediction. Generalized expectation propagation (GEP) was recently proposed as a powerful approximate inference method based on minimizing the Kullback-Leibler (KL) divergence between the true posterior and the approximate distribution. In this paper, we further instantiate GEP to provide an effective and efficient approximate inference method for BNNs. We evaluate this method on BNNs, including fully connected and convolutional neural networks, on multiple benchmark datasets, and show better performance than several state-of-the-art approximate inference methods.
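The KL-minimization step underlying expectation-propagation-style methods can be illustrated with a minimal sketch (this is not the paper's GEP algorithm itself, just the standard building block): when the approximating distribution q is in an exponential family, minimizing KL(p || q) reduces to matching the moments of the target p. The mixture used as a stand-in "true posterior" below is an arbitrary example chosen for illustration.

```python
import numpy as np

# Grid for numerical integration over a 1-D latent variable.
xs = np.linspace(-12.0, 12.0, 24001)
dx = xs[1] - xs[0]

def gauss(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

# Stand-in "true posterior" p: a two-component Gaussian mixture
# (hypothetical example; any non-Gaussian target would do).
p = 0.6 * gauss(xs, 1.0, 1.0) + 0.4 * gauss(xs, -2.0, 4.0)
p /= p.sum() * dx  # renormalize against truncation of the grid

# Moment matching: over Gaussian q, KL(p || q) is minimized by matching
# the mean and variance of p -- the core EP update step.
mu = (xs * p).sum() * dx
var = ((xs - mu) ** 2 * p).sum() * dx

def cross_entropy(mu_q, var_q):
    # E_p[-log q]; minimizing this in (mu_q, var_q) is equivalent to
    # minimizing KL(p || q), since the entropy of p is a constant.
    return 0.5 * np.log(2 * np.pi * var_q) + \
           (0.5 * (xs - mu_q) ** 2 / var_q * p).sum() * dx

print(mu, var)  # roughly -0.2 and 4.36 for this mixture
print(cross_entropy(mu, var) < cross_entropy(0.0, 1.0))  # matched moments win
```

For a BNN, the same moment-matching operation is applied factor by factor over the network's weight posterior rather than to a one-dimensional density.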
Keywords
Generalized expectation propagation, Bayesian neural networks, Approximate inference