Hierarchically branched diffusion models leverage dataset structure for class-conditional generation
arXiv (2022)
Abstract
Class-labeled datasets, particularly those common in scientific domains, are
rife with internal structure, yet current class-conditional diffusion models
ignore these relationships and implicitly diffuse on all classes in a flat
fashion. To leverage this structure, we propose hierarchically branched
diffusion models as a novel framework for class-conditional generation.
Branched diffusion models rely on the same diffusion process as traditional
models, but learn reverse diffusion separately for each branch of a hierarchy.
We highlight several advantages of branched diffusion models over current
state-of-the-art methods for class-conditional diffusion, including extension
to novel classes in a continual-learning setting, a more sophisticated form of
analogy-based conditional generation (i.e., transmutation), and novel
interpretability of the generation process. We extensively evaluate branched
diffusion models on several benchmark and large real-world scientific datasets
spanning many data modalities.