
Coded Decentralized Learning with Gradient Descent for Big Data Analytics

IEEE Communications Letters (2020)

Abstract
Machine learning is an effective technique for big data analytics. We focus on big data analytics with decentralized learning in large-scale networks. Fountain codes are applied to the decentralized learning process to reduce the communication load incurred by exchanging intermediate learning parameters among fog nodes. Two scenarios, disjoint datasets and overlapping datasets, are analyzed. Comparison results show that the Fountain-based scheme reduces the communication load significantly in large-scale networks, especially when the quality of the communication links is relatively poor and/or the number of fog nodes is large.
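A minimal sketch of the idea described in the abstract, not the authors' construction: it assumes a linear-regression loss, disjoint data shards per fog node, and real-valued random linear combinations of gradient blocks standing in for rateless Fountain (LT/Raptor-style) encoding. The function names, packet counts, and erasure model are illustrative assumptions.

```python
# Sketch: fountain-style coded gradient exchange among fog nodes over an
# erasure channel (illustrative simplification of the paper's scheme).
import numpy as np

rng = np.random.default_rng(0)

def local_gradient(w, X, y):
    """Gradient of 0.5 * ||Xw - y||^2 / n on one node's data shard."""
    return X.T @ (X @ w - y) / len(y)

def fountain_encode(grad, n_blocks, n_packets):
    """Split the gradient into blocks and emit random linear combinations
    (a real-valued stand-in for rateless Fountain encoding)."""
    blocks = grad.reshape(n_blocks, -1)            # (n_blocks, block_len)
    coeffs = rng.standard_normal((n_packets, n_blocks))
    return coeffs, coeffs @ blocks                 # coded packets

def fountain_decode(coeffs, packets, n_blocks):
    """Recover the gradient once enough coded packets survive the channel."""
    if np.linalg.matrix_rank(coeffs) < n_blocks:
        return None                                # not yet decodable
    blocks, *_ = np.linalg.lstsq(coeffs, packets, rcond=None)
    return blocks.reshape(-1)

# Toy decentralized run: disjoint shards, lossy links between fog nodes.
d, n_nodes, n_blocks = 8, 4, 4
erasure_prob = 0.3                                 # link quality
X = [rng.standard_normal((50, d)) for _ in range(n_nodes)]
w_true = rng.standard_normal(d)
y = [Xi @ w_true + 0.1 * rng.standard_normal(50) for Xi in X]

w = np.zeros(d)
for step in range(200):
    agg = np.zeros(d)
    for i in range(n_nodes):
        g = local_gradient(w, X[i], y[i])
        # Rateless: send coded packets until peers can decode the gradient.
        coeffs, packets = fountain_encode(g, n_blocks, n_packets=3 * n_blocks)
        kept = rng.random(len(packets)) > erasure_prob
        g_hat = fountain_decode(coeffs[kept], packets[kept], n_blocks)
        if g_hat is not None:
            agg += g_hat                           # decoded contribution
    w -= 0.1 * agg / n_nodes                       # gradient-descent step
print("recovery error:", np.linalg.norm(w - w_true))
```

The design intuition, under these assumptions, is that rateless encoding lets a node keep emitting coded packets until its peers have accumulated enough to decode, avoiding per-packet acknowledgement and retransmission; this is where the communication-load savings over poor links and many fog nodes would come from.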
Keywords
Big Data, Encoding, Decoding, f noise, Task analysis, Generators, Machine learning, Big data, decentralized learning, gradient descent, Fountain codes, communication load