A Characteristic Function Approach to Deep Implicit Generative Modeling

CVPR 2020

Abstract
In this paper, we formulate the problem of learning an Implicit Generative Model (IGM) as minimizing the expected distance between characteristic functions. Specifically, we match the characteristic functions of the real and generated data distributions under a suitably chosen weighting distribution. This distance measure, which we term the characteristic function distance (CFD), can be (approximately) computed with time complexity linear in the number of samples, in contrast to the quadratic-time Maximum Mean Discrepancy (MMD). By replacing the discrepancy measure in the critic of a GAN with the CFD, we obtain a model that is simple to implement and stable to train. The proposed metric enjoys desirable theoretical properties, including continuity and differentiability with respect to the generator parameters and continuity in the weak topology. We further propose a variant of the CFD in which the parameters of the weighting distribution are also optimized during training; this obviates the need for manual tuning and improves test power relative to the fixed-weight CFD. Experiments show that our proposed method outperforms WGAN and MMD-GAN variants on a variety of unsupervised image generation benchmark datasets.
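As a concrete illustration of the linear-time estimator described above (a minimal sketch, not the authors' reference code), the squared CFD between two sample sets can be approximated by Monte Carlo averaging over frequencies drawn from the weighting distribution. The function names, the Gaussian N(0, sigma^2 I) choice of weighting distribution, and the frequency count n_freqs below are illustrative assumptions:

import numpy as np

def empirical_cf(samples, freqs):
    # Empirical characteristic function phi(t) = E[exp(i <t, x>)],
    # estimated by the sample mean over the data points.
    # samples: (n, d) array; freqs: (k, d) array of frequency vectors t_j.
    proj = samples @ freqs.T                  # (n, k) inner products <t_j, x_i>
    return np.exp(1j * proj).mean(axis=0)     # (k,) complex values

def cfd_squared(real, fake, n_freqs=64, sigma=1.0, seed=None):
    # Squared CFD under a Gaussian N(0, sigma^2 I) weighting distribution,
    # approximated with n_freqs sampled frequencies. Cost is linear in the
    # number of data samples, unlike the quadratic-time MMD estimator.
    rng = np.random.default_rng(seed)
    d = real.shape[1]
    freqs = rng.normal(scale=sigma, size=(n_freqs, d))
    diff = empirical_cf(real, freqs) - empirical_cf(fake, freqs)
    return float(np.mean(np.abs(diff) ** 2))

In the optimized-weight variant the abstract mentions, the weighting-distribution parameters (here, sigma) would be treated as trainable and updated alongside the critic rather than fixed by hand.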
Keywords
IGM, effective data-driven models, expected distance, data distributions, distance metric, characteristic function distance, CFD, linear time complexity, discrepancy measure, generator parameters, weighting distribution parameters, MMD-GAN, unsupervised image generation benchmarks, characteristic function approach, deep implicit generative modeling, generating samples