Communication efficient distributed learning of neural networks in Big Data environments using Spark

2021 IEEE International Conference on Big Data (Big Data)

Abstract

Distributed (or federated) training of neural networks is an important approach to significantly reduce training time. Previous experiments on communication-efficient distributed learning have shown that model averaging, even though it is provably correct only in the case of convex loss functions, also works for the training of neural networks in some cases, although restricted to simple examples with r...
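As a rough illustration of the model-averaging scheme the abstract alludes to, the sketch below trains independent copies of a simple model on disjoint data shards and averages their parameters between communication rounds, so only the parameters (not per-minibatch gradients) cross the network. This is a minimal sketch in plain NumPy on an assumed synthetic linear-regression task; the shard count, learning rate, and round count are illustrative assumptions, not the paper's Spark implementation.

```python
# Minimal model-averaging sketch (illustrative assumptions throughout;
# not the paper's actual Spark implementation).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear-regression task: y = X @ w_true + noise (assumption).
w_true = np.array([2.0, -1.0, 0.5])
X = rng.normal(size=(1200, 3))
y = X @ w_true + 0.1 * rng.normal(size=1200)

def local_sgd(w, X_shard, y_shard, lr=0.01, epochs=5):
    """Plain SGD on one worker's data shard, starting from the shared model w."""
    w = w.copy()
    for _ in range(epochs):
        for i in rng.permutation(len(X_shard)):
            grad = (X_shard[i] @ w - y_shard[i]) * X_shard[i]
            w -= lr * grad
    return w

# Split the data across (simulated) workers.
num_workers = 4
shards = list(zip(np.array_split(X, num_workers),
                  np.array_split(y, num_workers)))

w_global = np.zeros(3)
for round_idx in range(5):  # communication rounds
    # Each worker trains independently on its shard ...
    local_models = [local_sgd(w_global, Xs, ys) for Xs, ys in shards]
    # ... and the driver averages the parameter vectors (model averaging).
    w_global = np.mean(local_models, axis=0)
    mse = np.mean((X @ w_global - y) ** 2)
    print(f"round {round_idx}: mse={mse:.4f}")
```

For this convex loss, averaging is provably sound; the paper's point, per the abstract, is that the same scheme can also work for (non-convex) neural-network training in some cases.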
Keywords
Training, Deep learning, Computer aided instruction, Distance learning, Neural networks, Distributed databases, Big Data