Bayesian Generative Adversarial Nets with Dropout Inference

CODS-COMAD 2021: Proceedings of the 3rd ACM India Joint International Conference on Data Science & Management of Data (8th ACM IKDD CODS & 26th COMAD), 2021

Abstract
Generative adversarial networks are one of the most popular approaches to generating new data from complex, high-dimensional data distributions. They have revolutionized the area of generative models by producing quality samples that closely resemble the true data distribution. However, these samples often cover only a few high-density regions of the true data distribution. Because some modes are missing from the generated data, this issue is referred to as mode collapse. Bayesian GANs (BGANs) can address this to a great extent by applying Bayesian learning principles. Instead of learning point estimates of the network parameters, BGANs learn a probability distribution over these parameters and use the posterior distribution over parameters to make predictions. As these models are huge neural networks, analytical inference is not feasible due to the intractable likelihood and evidence terms. Hence, BGANs perform approximate inference based on stochastic gradient Hamiltonian Monte Carlo (SGHMC) sampling, which is computationally expensive and exhibits convergence problems. We propose a simple and effective Bayesian GAN model based on Monte Carlo dropout inference (BDGAN). We establish a theoretical connection between variational inference in Bayesian GANs and Monte Carlo dropout in GANs. The effectiveness of the proposed model in overcoming mode collapse is demonstrated on various synthetic and real-world datasets. Additionally, we analyse training time and memory usage to showcase the proposed method's advantages over the Bayesian GAN.
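To illustrate the core idea described in the abstract, the sketch below shows Monte Carlo dropout inference applied to a GAN generator: dropout is kept active at sampling time, so repeated forward passes correspond to sampling different weight configurations, approximating draws from a posterior over generator parameters. This is a minimal illustrative sketch in PyTorch, not the authors' implementation; the layer sizes and the `mc_dropout_samples` helper are assumptions made here for clarity.

```python
# Minimal sketch (assumed, not the paper's code): MC dropout inference
# for a GAN generator implemented as a small MLP in PyTorch.
import torch
import torch.nn as nn

class DropoutGenerator(nn.Module):
    def __init__(self, latent_dim=100, hidden_dim=256, out_dim=784, p=0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, hidden_dim),
            nn.ReLU(),
            nn.Dropout(p),          # dropout kept stochastic at inference time
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Dropout(p),
            nn.Linear(hidden_dim, out_dim),
            nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z)

def mc_dropout_samples(generator, z, n_samples=10):
    """Generate n_samples outputs for the same latent codes with dropout on.

    Each stochastic forward pass uses a different dropout mask, i.e. a
    different effective set of weights, which approximates sampling
    generators from an approximate posterior over parameters.
    """
    generator.train()  # keep dropout layers active
    with torch.no_grad():
        return torch.stack([generator(z) for _ in range(n_samples)])

if __name__ == "__main__":
    gen = DropoutGenerator()
    z = torch.randn(16, 100)             # a batch of latent codes
    samples = mc_dropout_samples(gen, z)  # shape: (10, 16, 784)
    print(samples.shape)
```

In this reading, the multiple dropout-perturbed generators play the role of the posterior samples that BGANs obtain via SGHMC, but at the cost of ordinary forward passes rather than an expensive sampling chain.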
Keywords
bayesian inference, generative adversarial nets, monte carlo dropout, mode collapse