Energy-Inspired Models: Learning with Sampler-Induced Distributions

Advances in Neural Information Processing Systems 32 (NeurIPS 2019)

Abstract
Energy-based models (EBMs) are powerful probabilistic models [8, 43], but suffer from intractable sampling and density evaluation due to the partition function. As a result, inference in EBMs relies on approximate sampling algorithms, leading to a mismatch between the model and inference. Motivated by this, we consider the sampler-induced distribution as the model of interest and maximize the likelihood of this model. This yields a class of energy-inspired models (EIMs) that incorporate learned energy functions while still providing exact samples and tractable log-likelihood lower bounds. We describe and evaluate three instantiations of such models based on truncated rejection sampling, self-normalized importance sampling, and Hamiltonian importance sampling. These models perform comparably to or outperform the recently proposed Learned Accept/Reject Sampling algorithm [5], and provide new insights into ranking Noise Contrastive Estimation [33, 45] and Contrastive Predictive Coding [55]. Moreover, EIMs allow us to generalize a recent connection between multi-sample variational lower bounds [9] and auxiliary variable variational inference [1, 61, 57, 46]. We show how recent variational bounds [9, 48, 51, 41, 68, 50, 63] can be unified with EIMs as the variational family.
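
To make the sampler-induced distribution concrete, the self-normalized importance sampling (SNIS) instantiation can be read as: draw several candidates from a tractable proposal, reweight them with the learned energy, and resample one candidate in proportion to those weights. Because the induced distribution of this procedure is itself taken as the model, the procedure yields exact samples from the model by construction. The sketch below illustrates this under generic assumptions; `propose`, `log_energy`, and `k` are placeholder names for exposition, not an API from the paper.

```python
import numpy as np

def snis_sample(propose, log_energy, k=16, rng=None):
    """Draw one exact sample from the SNIS-induced distribution.

    propose(n)    -- draws n i.i.d. samples from a tractable proposal pi(x)
    log_energy(x) -- learned unnormalized log-weight; the induced model
                     tilts the proposal toward high-energy regions
    k             -- number of proposal candidates (illustrative default)
    """
    rng = rng or np.random.default_rng()
    xs = propose(k)                       # k candidates from the proposal
    logw = np.array([log_energy(x) for x in xs])
    w = np.exp(logw - logw.max())         # log-sum-exp stabilization
    probs = w / w.sum()                   # self-normalized importance weights
    idx = rng.choice(k, p=probs)          # resample one candidate
    return xs[idx]
```

The same normalized weights drive a multi-sample log-likelihood lower bound in the spirit of [9], which is what makes training the energy function by (lower-bounded) maximum likelihood tractable for this model class.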
Keywords
partition function