Is Systematic Data Sharding able to Stabilize Asynchronous Parameter Server Training?

2021 IEEE International Conference on Big Data (Big Data)(2021)

Cited 1 | Viewed 13
Abstract
Over recent years, deep learning has grown in popularity across various domains, introducing complex models to handle the data explosion. However, while such model architectures can accommodate the enormous amount of data, a single computing node cannot train a model on the whole data set in a timely fashion. Thus, specialized distributed architectures have been proposed, most of which ...
Keywords
Training,Measurement,Systematics,Computational modeling,Computer architecture,Big Data,Data models