Variational Inference with Orthogonal Normalizing Flows

Bayesian Deep Learning workshop @ NIPS 2017

Abstract
Variational inference relies on flexible approximate posterior distributions. In many settings, very simple posteriors such as diagonal-covariance Gaussians are used. Rezende and Mohamed [2015] propose constructing more flexible posteriors by transforming a simple base distribution through a series of invertible transformations with easily computable Jacobians. The resulting transformed density after one such transformation is given by:
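The indexing site truncates the abstract at this point. For reference, the standard change-of-variables formula from Rezende and Mohamed [2015], which the sentence above introduces, reads:

```latex
q'(z') = q(z) \left| \det \frac{\partial f}{\partial z} \right|^{-1},
\qquad z' = f(z),
```

where $f$ is one invertible transformation and $q$ is the base density before the transformation.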
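As a concrete illustration of one such invertible transformation with an easily computable Jacobian, the sketch below implements a planar flow from Rezende and Mohamed [2015] in NumPy. It is not the orthogonal flow proposed by this paper; the parameter values `u`, `w`, `b` are arbitrary placeholders chosen so the map stays invertible.

```python
import numpy as np

def planar_flow(z, u, w, b):
    """Apply one planar flow z' = z + u * tanh(w.z + b) to a batch of
    samples z (shape (n, d)) and return (z', log|det Jacobian|).

    For this flow the log-determinant has the closed form
    log|1 + u . psi(z)| with psi(z) = h'(w.z + b) * w, so it costs
    O(d) per sample rather than O(d^3)."""
    a = np.tanh(z @ w + b)                  # h(w.z + b), shape (n,)
    z_new = z + np.outer(a, u)              # transformed samples, (n, d)
    psi = (1.0 - a**2)[:, None] * w         # h'(w.z + b) * w, (n, d)
    log_det = np.log(np.abs(1.0 + psi @ u)) # log|det dz'/dz|, (n,)
    return z_new, log_det

# Illustrative parameters (w.u >= -1 keeps the flow invertible).
rng = np.random.default_rng(0)
z = rng.standard_normal((5, 2))
u = np.array([0.5, -0.3])
w = np.array([1.0, 0.2])
b = 0.1
z_new, log_det = planar_flow(z, u, w, b)
```

Stacking several such maps and summing their `log_det` terms yields the log-density of the final sample under the transformed posterior, which is what the flow-based variational objective requires.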