Privacy-Preserving Asynchronous Federated Learning Mechanism for Edge Network Computing

IEEE Access (2020)

Abstract
In the traditional cloud architecture, data must be uploaded to the cloud for processing, introducing transmission and response delays. Edge networks have emerged to address this: processing data on edge nodes reduces transmission delay and improves response speed. In recent years, demand for artificial intelligence in edge networks has grown. However, the data held by any single edge node is limited and insufficient for machine learning. Performing machine learning across an edge network while keeping each node's data confidential has therefore become a research hotspot. This paper proposes a Privacy-Preserving Asynchronous Federated Learning Mechanism for Edge Network Computing (PAFLM), which allows multiple edge nodes to perform federated learning more efficiently without sharing their private data. Compared with traditional distributed learning, the proposed method compresses the communication between nodes and the parameter server during training without affecting accuracy. Moreover, it allows a node to join or quit at any point in the learning process, making it suitable for scenarios with highly mobile edge devices.
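The abstract mentions compressing node-to-server communication via gradient compression. As a minimal illustrative sketch (not the paper's exact PAFLM scheme), top-k sparsification is one common way to achieve this: each node sends only the largest-magnitude gradient entries and their indices, and the parameter server rebuilds a sparse gradient. The function names below are hypothetical.

```python
import numpy as np

def compress_gradient(grad, ratio=0.1):
    """Keep only the largest-magnitude entries of a gradient array.

    Top-k sparsification: an illustrative stand-in for the gradient
    compression the paper's keywords refer to, not PAFLM itself.
    Returns the kept indices, their values, and the original shape.
    """
    flat = grad.ravel()
    k = max(1, int(flat.size * ratio))
    # Indices of the k largest-magnitude components.
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx], grad.shape

def decompress_gradient(idx, values, shape):
    """Rebuild a dense (mostly zero) gradient on the parameter server."""
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = values
    return flat.reshape(shape)

# A node compresses its local gradient before upload; at ratio=0.1 it
# transmits roughly 10% of the entries (plus their indices).
grad = np.random.randn(100)
idx, vals, shape = compress_gradient(grad, ratio=0.1)
restored = decompress_gradient(idx, vals, shape)
```

In an asynchronous setting, the server would apply each such sparse update as it arrives rather than waiting to synchronize all nodes, which is what lets devices join or leave mid-training.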
Keywords
Edge computing, cloud computing, servers, learning systems, training, machine learning, data models, federated learning, privacy preservation, asynchronous distributed network, gradient compression