Neural Flow Diffusion Models: Learnable Forward Process for Improved Diffusion Modelling
arXiv (2024)
Abstract
Conventional diffusion models typically rely on a fixed forward process,
which implicitly defines complex marginal distributions over latent variables.
This can often complicate the reverse process's task of learning generative
trajectories, and result in costly inference for diffusion models. To address
these limitations, we introduce Neural Flow Diffusion Models (NFDM), a novel
framework that enhances diffusion models by supporting a broader range of
forward processes beyond the fixed linear Gaussian. We also propose a novel
parameterization technique for learning the forward process. Our framework
provides an end-to-end, simulation-free optimization objective, effectively
minimizing a variational upper bound on the negative log-likelihood.
Experimental results demonstrate NFDM's strong performance, evidenced by
state-of-the-art likelihood estimation. Furthermore, we investigate NFDM's
capacity for learning generative dynamics with specific characteristics, such
as deterministic straight-line trajectories. This exploration underscores
NFDM's versatility and its potential for a wide range of applications.
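To make the abstract's central idea concrete, the following is a minimal, illustrative sketch (not NFDM's actual parameterization) of a learnable Gaussian forward process. The schedule parameters `a` and `b`, the function name `forward_sample`, and the specific forms of `alpha` and `sigma` are all assumptions for illustration; the key property shown is simulation-free sampling, where a latent z_t at any time t is drawn in one shot rather than by rolling out an SDE.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D example of a learnable affine forward process:
#   q(z_t | x) = N(alpha(t) * x, sigma(t)^2)
# where the schedule shapes depend on trainable scalars (a, b).
# All names and functional forms here are illustrative, not NFDM's.
def forward_sample(x, t, a, b):
    # alpha interpolates from 1 (pure data, t=0) to 0 (pure noise, t=1);
    # sigma interpolates from 0 to 1, with a learnable mid-trajectory bump.
    alpha = (1.0 - t) ** a                               # a > 0: schedule shape
    sigma = np.sqrt(1.0 - alpha**2) + b * t * (1.0 - t)  # b: learnable bump
    eps = rng.standard_normal()
    # Simulation-free draw: z_t is sampled directly at time t.
    return alpha * x + sigma * eps, alpha, sigma

z, alpha, sigma = forward_sample(x=2.0, t=0.5, a=1.0, b=0.1)
```

Because z_t is an explicit reparameterized function of x, t, and noise, gradients with respect to the schedule parameters flow through each sample, which is what makes an end-to-end objective over the forward process tractable.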