Towards Faster Distributed Deep Learning: Data Hashing Techniques

2019 IEEE International Conference on Big Data (Big Data), 2019

Abstract
Nowadays, deep learning is a crucial part of a variety of big data applications. Both the vast amount of data and the high complexity of state-of-the-art neural networks have made it necessary to perform network training in a distributed manner across clusters. Since synchronization overheads are usually fatal to training performance, asynchronous training is typically preferred in such settings. However, this training mode is sensitive to conflicting updates, which most commonly occur when workers train on entirely different parts of the data. To reduce this phenomenon, in this paper we propose the use of hashing schemes when distributing training data across workers.
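The abstract describes assigning training data to workers via hashing. As a minimal illustrative sketch (not the paper's actual scheme), one simple approach is to shard samples deterministically by hashing a sample identifier, so each sample always lands on the same worker; the `sample_id` naming and the use of MD5 here are assumptions for illustration:

```python
import hashlib

def assign_worker(sample_id: str, num_workers: int) -> int:
    """Map a sample to a worker via a stable hash of its identifier.

    A deterministic hash ensures the same sample is always routed to
    the same worker, regardless of insertion order or process restarts.
    """
    digest = hashlib.md5(sample_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_workers

# Example: shard ten hypothetical samples across four workers.
samples = [f"img_{i:04d}" for i in range(10)]
shards = {w: [] for w in range(4)}
for s in samples:
    shards[assign_worker(s, 4)].append(s)
```

Similarity-preserving hash families (e.g. locality-sensitive hashing) could instead be used so that related samples are grouped on the same worker, which is closer in spirit to reducing the conflicting-update problem the abstract raises.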
Keywords
deep learning, data hashing, distributed training