Joint Training of Variational Auto-Encoder and Latent Energy-Based Model

CVPR 2020

Abstract
This paper proposes a joint training method to learn both the variational auto-encoder (VAE) and the latent energy-based model (EBM). The joint training of the VAE and the latent EBM is based on an objective function consisting of three Kullback-Leibler divergences between three joint distributions on the latent vector and the image. The objective function takes an elegant symmetric and anti-symmetric form of a divergence triangle that seamlessly integrates variational and adversarial learning. In this joint training scheme, the latent EBM serves as a critic of the generator model, while the generator model and the inference model of the VAE serve as the approximate synthesis sampler and inference sampler of the latent EBM. Our experiments show that the joint training greatly improves the synthesis quality of the VAE. It also enables learning of an energy function that is capable of detecting out-of-sample examples for anomaly detection.
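To make the structure of the objective concrete, the following is a minimal illustrative sketch, not the paper's exact loss: three univariate Gaussians stand in for the three joint distributions (the data-plus-inference distribution Q, the prior-plus-generator distribution P, and the latent-EBM distribution Pi), and a divergence-triangle-style combination of three KL terms is evaluated in closed form. The particular sign pattern and toy parameters are assumptions for illustration.

```python
import numpy as np

def kl_gauss(m1, s1, m2, s2):
    """Closed-form KL(N(m1, s1^2) || N(m2, s2^2)) for univariate Gaussians."""
    return np.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

# Toy stand-ins for the three joint distributions (hypothetical parameters).
q  = (0.0, 1.0)   # data + inference model
p  = (0.5, 1.2)   # prior + generator model
pi = (0.2, 1.1)   # latent EBM

# Divergence-triangle-style combination: two KL terms pull the samplers
# toward the EBM, while the negated term lets the EBM act as a critic
# (minimized over Q and P, maximized over Pi in the adversarial scheme).
triangle = kl_gauss(*q, *p) + kl_gauss(*p, *pi) - kl_gauss(*q, *pi)
print(round(triangle, 4))
```

In the actual method each distribution is a joint density over image and latent vector, and the three networks are updated alternately rather than evaluated in closed form; the sketch only shows how the symmetric and anti-symmetric KL terms combine.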
Keywords
variational auto-encoder, latent energy-based model, joint training method, VAE, latent EBM, objective function, Kullback-Leibler divergences, latent vector, elegant symmetric form, seamlessly integrates variational learning, adversarial learning, joint training scheme, generator model, inference model, approximate synthesis sampler, inference sampler, energy function, antisymmetric form, divergence triangle, anomaly detection