Improved Contrastive Divergence Training of Energy-Based Models
International Conference on Machine Learning, Vol. 139 (2021)
Abstract
Contrastive divergence is a popular method for training energy-based models, but it is known to suffer from training instabilities. We propose an adaptation that improves contrastive divergence training by scrutinizing a gradient term that is difficult to calculate and is often left out for convenience. We show that this gradient term is numerically significant and, in practice, important for avoiding training instabilities, while remaining tractable to estimate. We further highlight how data augmentation and multi-scale processing can be used to improve model robustness and generation quality. Finally, we empirically evaluate the stability of model architectures and show improved performance on a host of benchmarks and use cases, such as image generation, out-of-distribution (OOD) detection, and compositional generation.
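As a rough illustration of the idea in the abstract, the sketch below implements contrastive divergence training with the usually-omitted gradient term in PyTorch: negative samples are drawn with a differentiable short-run Langevin chain, and the extra loss term evaluates their energy under a parameter-frozen copy of the model (a stop-gradient on the parameters), so that gradients reach the parameters only through the sampler. The TinyEBM network, the hyperparameters, and the frozen-copy trick are illustrative assumptions for this sketch, not the authors' exact implementation, and the paper's data augmentation and multi-scale components are omitted.

```python
import copy

import torch
import torch.nn as nn


class TinyEBM(nn.Module):
    """Hypothetical stand-in energy network: image batch -> scalar energies."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(3 * 32 * 32, 256),
            nn.SiLU(),  # smooth activation, so second-order grads are well-behaved
            nn.Linear(256, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)


def langevin_sample(energy_fn, x, steps=20, step_size=10.0, noise=0.005):
    """Short-run Langevin dynamics. create_graph=True keeps the chain
    differentiable so the extra loss term can backpropagate through it."""
    x = x.clone().requires_grad_(True)
    for _ in range(steps):
        grad = torch.autograd.grad(energy_fn(x).sum(), x, create_graph=True)[0]
        x = x - step_size * grad + noise * torch.randn_like(x)
    return x


def improved_cd_loss(model, x_data, x_init):
    x_neg = langevin_sample(model, x_init)
    # Standard contrastive divergence terms; samples treated as constants.
    cd_loss = model(x_data).mean() - model(x_neg.detach()).mean()
    # Usually-omitted gradient term: evaluate the samples' energy under a
    # parameter-frozen copy of the model (stop-gradient on theta), so that
    # gradients reach theta only through the differentiable sampling chain.
    frozen = copy.deepcopy(model)
    for p in frozen.parameters():
        p.requires_grad_(False)
    kl_loss = frozen(x_neg).mean()
    return cd_loss + kl_loss


# Usage: one training step on random stand-in data.
model = TinyEBM()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
x_data, x_init = torch.rand(8, 3, 32, 32), torch.rand(8, 3, 32, 32)
loss = improved_cd_loss(model, x_data, x_init)
opt.zero_grad()
loss.backward()
opt.step()
```

Note the division of labor: cd_loss supplies the usual positive/negative energy gradients, while kl_loss contributes only the path through the sampler; in the paper this extra term arises from the KL divergence between the sampling distribution and the model distribution.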
Keywords
contrastive divergence training, energy-based models