On Independent Samples Along the Langevin Diffusion and the Unadjusted Langevin Algorithm
CoRR (2024)
Abstract
We study the rate at which the initial and current random variables become
independent along a Markov chain, focusing on the Langevin diffusion in
continuous time and the Unadjusted Langevin Algorithm (ULA) in discrete time.
We measure the dependence between random variables via their mutual
information. For the Langevin diffusion, we show the mutual information
converges to 0 exponentially fast when the target is strongly log-concave,
and at a polynomial rate when the target is weakly log-concave. These rates are
analogous to the mixing time of the Langevin diffusion under similar
assumptions. For the ULA, we show the mutual information converges to 0
exponentially fast when the target is strongly log-concave and smooth. We prove
our results by developing mutual information versions of the mixing time analyses of
these Markov chains. We also provide alternative proofs based on strong data
processing inequalities for the Langevin diffusion and the ULA, and by showing
regularity results for these processes in mutual information.
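To make the discrete-time object concrete: the ULA iterates x_{k+1} = x_k - h ∇f(x_k) + √(2h) ξ_k, where the target is π(x) ∝ exp(-f(x)) and ξ_k are i.i.d. standard Gaussians. Below is a minimal sketch, assuming a 1-D standard Gaussian target (f(x) = x²/2, so ∇f(x) = x); the step size `h`, iteration count, and burn-in are illustrative choices, not values from the paper.

```python
import numpy as np

def ula_samples(grad_f, x0, h=0.1, n_steps=5000, seed=0):
    """Unadjusted Langevin Algorithm:
    x_{k+1} = x_k - h * grad_f(x_k) + sqrt(2h) * xi_k,  xi_k ~ N(0, 1).
    """
    rng = np.random.default_rng(seed)
    x = x0
    xs = np.empty(n_steps)
    for k in range(n_steps):
        x = x - h * grad_f(x) + np.sqrt(2.0 * h) * rng.standard_normal()
        xs[k] = x
    return xs

# Illustrative target: N(0, 1), i.e. f(x) = x^2 / 2 and grad_f(x) = x.
samples = ula_samples(grad_f=lambda x: x, x0=3.0)
tail = samples[1000:]  # discard burn-in
print(tail.mean(), tail.std())
```

Note that ULA has an O(h) discretization bias: for this Gaussian target the chain's stationary variance is 1/(1 - h/2) rather than exactly 1, which is why the paper's exponential convergence guarantees are stated under strong log-concavity and smoothness of the target.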