Weak consistency and stochastic environments: harmonization of replicated machine learning models.

PaPoC@EuroSys (2016)

Abstract
Many machine learning (ML) models are stochastic in nature. We aim to combine the principles of weak consistency with large-scale distributed machine learning. We see interesting opportunities in this domain in (1) viewing parallel ML algorithms based on model replication as a "collaborative task" in which local progress on models is instantaneously exchanged, and (2) making this exchange more efficient by exploiting the models' underlying stochastic nature. Based on this motivation, we extend the notion of consistency to replicated objects with intrinsic stochastic structure and introduce harmonization as the reconciliation principle that enables efficient consistency maintenance of these objects. As a concrete application, we present the harmonization of replicated ML models.
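The abstract does not specify the harmonization rule itself. A minimal sketch of one plausible reconciliation step for replicated SGD-trained models, parameter averaging across replicas (an illustrative assumption, not necessarily the paper's actual method), might look like:

```python
import random

def sgd_step(w, grad, lr=0.1):
    # One local SGD update on a replica's copy of the model parameters.
    return [wi - lr * gi for wi, gi in zip(w, grad)]

def harmonize(replicas):
    # Hypothetical reconciliation: average the replicated parameter
    # vectors so all replicas agree on a common merged state.
    dim = len(replicas[0])
    merged = [sum(r[i] for r in replicas) / len(replicas) for i in range(dim)]
    return [list(merged) for _ in replicas]

# Two replicas start from the same model and diverge via local SGD steps
# on (simulated) independent stochastic gradients.
random.seed(0)
replicas = [[0.0, 0.0], [0.0, 0.0]]
for _ in range(5):
    replicas = [sgd_step(w, [random.gauss(0, 1) for _ in w]) for w in replicas]

replicas = harmonize(replicas)
assert replicas[0] == replicas[1]  # replicas agree after harmonization
```

Averaging tolerates out-of-order or delayed exchanges, which is what makes a weakly consistent exchange of local progress plausible for stochastic models.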
Keywords
Distributed systems, weak consistency, large-scale machine learning, stochastic gradient descent