Stochastic Gradient MCMC for State Space Models

SIAM Journal on Mathematics of Data Science (2019)

Abstract
State space models (SSMs) are a flexible approach to modeling complex time series. However, inference in SSMs is often computationally prohibitive for long time series. Stochastic gradient Markov chain Monte Carlo (SGMCMC) is a popular method for scalable Bayesian inference for large independent data. Unfortunately, when applied to dependent data, such as in SSMs, SGMCMC's stochastic gradient estimates are biased, as they break crucial temporal dependencies. To alleviate this, we propose stochastic gradient estimators that control this bias by performing additional computation in a "buffer" to reduce breaking dependencies. Furthermore, we derive error bounds for this bias and show a geometric decay under mild conditions. Using these estimators, we develop novel SGMCMC samplers for discrete, continuous, and mixed-type SSMs with analytic message passing. Our experiments on real and synthetic data demonstrate the effectiveness of our SGMCMC algorithms compared to batch MCMC, allowing us to scale inference to long time series with millions of time points.
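To make the buffering idea concrete, here is a minimal Python sketch, not the paper's implementation: SGLD over the AR(1) coefficient of a toy linear-Gaussian SSM, where each stochastic gradient is computed on a random subsequence extended by a left buffer of B points, approximating the gradient of log p(y_center | buffer) by a finite difference of Kalman-filter log-likelihoods. The model, the buffer length B, the subsequence length S, the finite-difference gradient, the left-buffer-only conditioning, and the clipping of phi are all illustrative assumptions; the paper's estimators instead use analytic message passing with buffers on both sides.

```python
# Hedged sketch: buffered stochastic-gradient SGLD for a linear-Gaussian SSM.
# All parameter choices (B, S, step size, priors) are illustrative, not the paper's.
import numpy as np

def kalman_loglik(y, phi, q=1.0, r=1.0):
    """Log marginal likelihood of y under x_t = phi*x_{t-1} + N(0,q), y_t = x_t + N(0,r)."""
    mu, P, ll = 0.0, q / max(1e-6, 1.0 - phi**2), 0.0  # stationary initialization
    for yt in y:
        mu_p = phi * mu                 # predict mean
        P_p = phi**2 * P + q            # predict variance
        S = P_p + r                     # marginal variance of y_t
        ll += -0.5 * (np.log(2 * np.pi * S) + (yt - mu_p)**2 / S)
        K = P_p / S                     # Kalman gain
        mu = mu_p + K * (yt - mu_p)     # update mean
        P = (1 - K) * P_p               # update variance
    return ll

def buffered_grad(y, phi, start, S, B, eps=1e-5):
    """Estimate d/dphi log p(y_center | left buffer) by finite differences of
    log p(y_{buffer+center}) - log p(y_{buffer}), rescaled to the full series."""
    T = len(y)
    lo, hi = max(0, start - B), min(T, start + S)
    def centered_ll(p):
        return kalman_loglik(y[lo:hi], p) - kalman_loglik(y[lo:start], p)
    g = (centered_ll(phi + eps) - centered_ll(phi - eps)) / (2 * eps)
    return (T / S) * g  # unbiased-in-expectation rescaling of the subsequence term

# --- Toy run: simulate data, then SGLD with buffered gradients ---
rng = np.random.default_rng(0)
T, phi_true = 2000, 0.8
x = np.zeros(T)
for t in range(1, T):
    x[t] = phi_true * x[t - 1] + rng.normal()
y = x + rng.normal(size=T)

phi, step, S, B = 0.0, 1e-4, 50, 10
for it in range(500):
    start = rng.integers(B, T - S)                 # subsequence with full left buffer
    grad = buffered_grad(y, phi, start, S, B)      # buffered likelihood gradient
    grad += -phi                                   # gradient of a N(0,1) prior on phi
    phi += 0.5 * step * grad + np.sqrt(step) * rng.normal()  # SGLD step
    phi = float(np.clip(phi, -0.99, 0.99))         # stabilizing projection (toy only)
print("posterior sample for phi:", phi)
```

Per the abstract, the bias introduced by subsampling decays geometrically in the buffer length under mild conditions, which is why a small buffer such as B = 10 can already be effective in practice.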
Keywords
stochastic gradient, Markov chain Monte Carlo, Bayesian inference, state space models, hidden Markov models, time series, exponential forgetting