An analysis of the noise schedule for score-based generative models
arXiv (2024)
Abstract
Score-based generative models (SGMs) aim to estimate a target data
distribution by learning score functions using only noise-perturbed samples
from the target. Recent literature has focused extensively on assessing the
error between the target and estimated distributions, gauging generative
quality through the Kullback-Leibler (KL) divergence and Wasserstein distances.
So far, all existing results have been obtained for a time-homogeneous speed of
the noise schedule. Under mild assumptions on the data distribution, we
establish an upper bound on the KL divergence between the target and the
estimated distributions that depends explicitly on any time-dependent noise
schedule. Assuming that the score is Lipschitz continuous, we provide an
improved error bound in Wasserstein distance, taking advantage of favourable
underlying contraction mechanisms. We also propose an algorithm to
automatically tune the noise schedule using the proposed upper bound. We
empirically illustrate the performance of the noise schedule optimization in
comparison with standard choices from the literature.
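The abstract states only that the noise schedule is tuned by minimizing the proposed KL upper bound; the bound itself is not given here. The sketch below is a generic illustration of that idea under assumed ingredients: the discretized schedule parameterization, the surrogate `kl_upper_bound` (its three terms are a plausible stand-in, not the paper's actual bound), and all hyperparameters are hypothetical.

```python
# Hypothetical sketch: tuning a discretized, time-dependent noise schedule
# by minimizing a surrogate upper bound via gradient descent. The surrogate
# below is an illustrative stand-in, NOT the bound derived in the paper.
import torch

T, n_steps = 1.0, 100

# Unconstrained parameters; softplus keeps the schedule speed positive.
raw = torch.zeros(n_steps, requires_grad=True)

def schedule_speed(raw):
    # Per-step speed beta_k > 0 of the noise schedule.
    return torch.nn.functional.softplus(raw) + 1e-4

def kl_upper_bound(beta, score_error=1e-3):
    # Illustrative surrogate with three typical ingredients:
    # a discretization penalty growing with the per-step speed,
    # a score-estimation term weighted by the total injected noise,
    # and a penalty for the forward process not reaching its prior.
    dt = T / beta.numel()
    disc = (beta ** 2).sum() * dt ** 2
    integrated = (beta * dt).sum()
    init = torch.exp(-integrated)
    return disc + score_error * integrated + init

opt = torch.optim.Adam([raw], lr=1e-2)
for _ in range(500):
    opt.zero_grad()
    loss = kl_upper_bound(schedule_speed(raw))
    loss.backward()
    opt.step()

beta_opt = schedule_speed(raw).detach()  # tuned time-dependent schedule
```

Once a valid upper bound is available in closed form, any such gradient-based optimization of the schedule parameters applies; only the body of `kl_upper_bound` would change.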