
Particle Denoising Diffusion Sampler

ICML 2024

Department of Statistics, University of Oxford

Abstract
Denoising diffusion models have become ubiquitous for generative modeling. The core idea is to transport the data distribution to a Gaussian by using a diffusion. Approximate samples from the data distribution are then obtained by estimating the time-reversal of this diffusion using score matching ideas. We follow here a similar strategy to sample from unnormalized probability densities and compute their normalizing constants. However, the time-reversed diffusion is here simulated by using an original iterative particle scheme relying on a novel score matching loss. Contrary to standard denoising diffusion models, the resulting Particle Denoising Diffusion Sampler (PDDS) provides asymptotically consistent estimates under mild assumptions. We demonstrate PDDS on multimodal and high dimensional sampling tasks.
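The abstract describes simulating a time-reversed diffusion with an iterative particle scheme to sample from an unnormalized density and estimate its normalizing constant. As a minimal illustrative sketch of the particle-sampler idea (a generic tempered sequential Monte Carlo sampler, not the paper's PDDS algorithm), the following moves a cloud of particles from a tractable base distribution to a target unnormalized density through a sequence of intermediate densities, reweighting, resampling, and applying a Metropolis move at each step; the normalizing constant is estimated from the running product of average incremental weights. The toy Gaussian target and all function names are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_gamma(x):
    # Unnormalized target: Gaussian with mean 2, unit variance.
    # Its true normalizing constant is Z = sqrt(2*pi).
    return -0.5 * (x - 2.0) ** 2

def log_p0(x):
    # Normalized base distribution: standard normal.
    return -0.5 * x ** 2 - 0.5 * np.log(2 * np.pi)

def smc_sampler(n_particles=5000, n_steps=50, step_size=0.5):
    # Geometric tempering path pi_b ∝ p0^(1-b) * gamma^b, b: 0 -> 1.
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    x = rng.standard_normal(n_particles)  # exact draws from p0
    log_z = 0.0
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Incremental importance weights for moving pi_{b_prev} -> pi_b.
        log_w = (b - b_prev) * (log_gamma(x) - log_p0(x))
        # Accumulate the log normalizing-constant estimate.
        log_z += np.log(np.mean(np.exp(log_w)))
        # Multinomial resampling proportional to the weights.
        w = np.exp(log_w - log_w.max())
        idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())
        x = x[idx]
        # One random-walk Metropolis move targeting the current tempered density.
        def log_pi(y):
            return (1.0 - b) * log_p0(y) + b * log_gamma(y)
        prop = x + step_size * rng.standard_normal(n_particles)
        accept = np.log(rng.random(n_particles)) < log_pi(prop) - log_pi(x)
        x = np.where(accept, prop, x)
    return x, np.exp(log_z)

samples, z_hat = smc_sampler()
```

With the toy target above, `z_hat` should approximate sqrt(2*pi) ≈ 2.507 and the particles should concentrate around the target mean of 2. PDDS replaces this hand-picked tempering path with a time-reversed diffusion whose drift is learned via a score matching loss, but the reweight/resample/move structure of the particle scheme is analogous.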
Chat Paper

Key points: This paper proposes a particle-based denoising diffusion sampler (PDDS) that simulates the time-reversed diffusion using a novel score matching loss, enabling sampling from unnormalized probability densities and computation of their normalizing constants, and demonstrates asymptotic consistency on multimodal and high-dimensional sampling tasks.

Methods: The authors adopt an iterative particle scheme, combined with a novel score matching loss, to simulate the time-reversed diffusion process and thereby sample effectively from unnormalized probability densities.

Experiments: PDDS is validated on multimodal and high-dimensional sampling tasks; specific dataset names are not given, but the results show that PDDS provides asymptotically consistent estimates on these tasks.