Memoized Online Variational Inference for Dirichlet Process Mixture Models.

NIPS 2013

Cited by 125
Abstract
Variational inference algorithms provide the most effective framework for large-scale training of Bayesian nonparametric models. Stochastic online approaches are promising, but are sensitive to the chosen learning rate and often converge to poor local optima. We present a new algorithm, memoized online variational inference, which scales to very large (yet finite) datasets while avoiding the complexities of stochastic gradient. Our algorithm maintains finite-dimensional sufficient statistics from batches of the full dataset, requiring some additional memory but still scaling to millions of examples. Exploiting nested families of variational bounds for infinite nonparametric models, we develop principled birth and merge moves allowing non-local optimization. Births adaptively add components to the model to escape local optima, while merges remove redundancy and improve speed. Using Dirichlet process mixture models for image clustering and denoising, we demonstrate major improvements in robustness and accuracy.
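The caching scheme the abstract describes can be made concrete in a few lines. Below is a minimal, illustrative Python sketch, not the authors' reference implementation: per-batch sufficient statistics are memoized, and the whole-dataset summary is kept exact by subtracting a batch's old cached statistics before adding its new ones. The unit-variance Gaussian likelihood, the fixed truncation level K, and the names local_step, batch_stats, and memoized_vi are assumptions made for illustration; the paper's full algorithm additionally uses Dirichlet-process stick-breaking weights and the birth/merge moves described above, which this sketch omits.

```python
import numpy as np

def local_step(X, means, log_weights):
    """E-step for one batch: soft responsibilities under a deliberately
    simplified unit-variance Gaussian likelihood (an assumption of this
    sketch, not the paper's model)."""
    sq_dist = ((X[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)  # (N, K)
    log_resp = log_weights[None, :] - 0.5 * sq_dist
    log_resp -= log_resp.max(axis=1, keepdims=True)   # stabilize before exp
    resp = np.exp(log_resp)
    resp /= resp.sum(axis=1, keepdims=True)
    return resp

def batch_stats(X, resp):
    """Finite-dimensional sufficient statistics for one batch."""
    return {"N": resp.sum(axis=0),   # expected counts per component, (K,)
            "Sx": resp.T @ X}        # responsibility-weighted sums, (K, D)

def merge_stats(total, j, k):
    """Statistics of the component formed by merging j and k: exact, and
    computed from the cache alone with no pass over the data."""
    return {"N": total["N"][j] + total["N"][k],
            "Sx": total["Sx"][j] + total["Sx"][k]}

def memoized_vi(batches, K, n_laps=5, seed=0):
    """One full pass over all batches per lap, caching each batch's stats."""
    rng = np.random.default_rng(seed)
    D = batches[0].shape[1]
    means = rng.normal(size=(K, D))
    memo = [None] * len(batches)                        # per-batch cache
    total = {"N": np.zeros(K), "Sx": np.zeros((K, D))}  # exact dataset summary
    for _ in range(n_laps):
        for b, X in enumerate(batches):
            # Crude stand-in for expected log mixture weights.
            log_w = np.log(total["N"] + 1.0) - np.log(total["N"].sum() + K)
            resp = local_step(X, means, log_w)
            new = batch_stats(X, resp)
            if memo[b] is not None:        # swap this batch's cached stats:
                for key in total:          # subtract the old contribution...
                    total[key] -= memo[b][key]
            for key in total:              # ...then add the fresh one
                total[key] += new[key]
            memo[b] = new
            # Global step from exact whole-dataset statistics.
            means = total["Sx"] / (total["N"][:, None] + 1e-8)
    return means

# Usage on synthetic two-cluster data split into ten batches.
rng = np.random.default_rng(1)
data = rng.normal(size=(1000, 2)) + rng.choice([-3.0, 3.0], size=(1000, 1))
batches = np.array_split(data, 10)
means = memoized_vi(batches, K=5)
```

Because the cached statistics are additive, a merge candidate's pooled whole-dataset statistics are exact without revisiting any data, as merge_stats illustrates; this additivity is what makes the paper's merge proposals cheap to evaluate.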
Keywords
scaling factor, classification, data mining, feature extraction, information processing, experimental design, adaptive systems, algorithms, pattern recognition, Bayes theorem, accuracy, statistical inference, optimization, Dirichlet integral, gradients, stochastic processes, batch processing, redundancy, neural nets