Variational Inference for DPGMM with Coresets

2017

Abstract
Performing estimation and inference on massive datasets under time and memory constraints is a critical task in machine learning. One approach to tackle these challenges is offered by coresets, succinct data summaries that come with strong theoretical guarantees, and can operate under computational resource restrictions. In this work, we explore how such data summaries can be used in posterior inference through variational methods. We develop a novel coreset construction for approximate posterior inference in the nonparametric Dirichlet process Gaussian mixture model. We empirically demonstrate how our method allows trading small approximation error for large gains in runtime and memory usage.
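The abstract outlines a general recipe: replace the full dataset with a small weighted coreset and run variational inference for a Dirichlet process Gaussian mixture on that summary. The sketch below illustrates this pipeline under stated assumptions; the importance-sampling probabilities (a lightweight-coreset-style sensitivity proxy), the resampling workaround for scikit-learn's unweighted fit, and all parameter values are illustrative choices, not the paper's construction.

```python
# Minimal sketch (not the paper's algorithm): summarize a large dataset with an
# importance-sampled coreset, then fit a variational DP Gaussian mixture on it.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)

# Synthetic "massive" dataset: three well-separated Gaussian clusters.
n = 100_000
X = np.concatenate([
    rng.normal(loc=c, scale=0.5, size=(n // 3, 2))
    for c in ([-4.0, 0.0], [0.0, 4.0], [4.0, 0.0])
])

# Sampling probabilities: uniform term plus a term proportional to squared
# distance from the data mean (assumed sensitivity proxy, lightweight-coreset style).
d2 = np.sum((X - X.mean(axis=0)) ** 2, axis=1)
q = 0.5 / len(X) + 0.5 * d2 / d2.sum()

# Draw a small coreset; weights w_i = 1 / (m * q_i) keep weighted sums unbiased.
m = 1_000
idx = rng.choice(len(X), size=m, replace=True, p=q)
coreset, weights = X[idx], 1.0 / (m * q[idx])

# Variational DP-GMM on the summary. scikit-learn's fit() takes no per-point
# weights, so we resample the coreset proportionally to the weights as a rough
# stand-in; a faithful implementation would scale each point's likelihood term
# in the variational objective (ELBO) by its coreset weight.
pseudo = coreset[rng.choice(m, size=m, p=weights / weights.sum())]
dpgmm = BayesianGaussianMixture(
    n_components=10,                                    # truncation level
    weight_concentration_prior_type="dirichlet_process",
    max_iter=500,
    random_state=0,
).fit(pseudo)

print("effective components:", int(np.sum(dpgmm.weights_ > 1e-2)))
```

Fitting on the 1,000-point summary rather than the full 100,000 points is what yields the runtime and memory savings the abstract refers to, at the cost of a small approximation error.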