FedHEONN: Federated and homomorphically encrypted learning method for one-layer neural networks

Future Generation Computer Systems: The International Journal of eScience (2023)

Abstract
Federated learning (FL) is a distributed approach to building collaborative learning models from decentralized data. This is relevant to many real applications, such as the Internet of Things, since the resulting models can be deployed on edge computing devices. FL approaches are motivated by and designed to protect privacy, a highly relevant issue given current data protection regulations. Although FL methods are privacy-preserving by design, recently published work shows that privacy leaks do occur, caused by attacks that extract private data from the information exchanged during learning. In this work, we present an FL method based on a neural network without hidden layers that incorporates homomorphic encryption (HE) to enhance robustness against such attacks. Unlike traditional FL methods, which require multiple rounds of training to converge, our method obtains the collaborative global model in a single training round, yielding an effective and efficient model that simplifies management of the FL training process. In addition, because it incorporates HE, the method is also robust against model inversion attacks. In experiments with large data sets and a large number of clients in a federated scenario, we show that the use of HE does not affect the accuracy of the model, whose results are competitive with state-of-the-art machine learning models. We also show that accuracy behaves identically for identically and non-identically distributed data scenarios.
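
To make the single-round idea concrete, below is a minimal sketch of how a collaborative one-layer (linear) model can be obtained in a single aggregation step from per-client sufficient statistics. This is not the authors' exact formulation: the helper names local_statistics and aggregate_and_solve, the ridge parameter lam, and the use of plain Gram matrices are illustrative assumptions. In FedHEONN the exchanged client matrices would additionally be protected with homomorphic encryption so the server can combine them without decrypting.

```python
# Illustrative sketch of single-round federated fitting of a one-layer model.
# NOT the exact FedHEONN algorithm: local_statistics, aggregate_and_solve,
# and the ridge parameter lam are assumptions made for illustration only.
import numpy as np

def local_statistics(X, y):
    """Each client summarizes its private data as sufficient statistics.
    Only these matrices (in the paper, homomorphically encrypted) leave
    the client; the raw data X, y never do."""
    A = X.T @ X          # d x d Gram matrix
    b = X.T @ y          # d-dimensional moment vector
    return A, b

def aggregate_and_solve(stats, lam=1e-3):
    """Server-side aggregation: because the statistics combine by simple
    addition, a single communication round suffices. With an additively
    homomorphic scheme these sums could be computed on ciphertexts."""
    d = stats[0][0].shape[0]
    A_sum = sum(A for A, _ in stats)
    b_sum = sum(b for _, b in stats)
    # Closed-form (ridge-regularized) weights of the one-layer model.
    return np.linalg.solve(A_sum + lam * np.eye(d), b_sum)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w_true = rng.normal(size=5)
    # Three clients, each holding its own private local data set.
    clients = []
    for _ in range(3):
        X = rng.normal(size=(100, 5))
        y = X @ w_true + 0.01 * rng.normal(size=100)
        clients.append((X, y))
    stats = [local_statistics(X, y) for X, y in clients]
    w_global = aggregate_and_solve(stats)
    print(np.allclose(w_global, w_true, atol=0.05))  # True
```

Because the server only needs to add the clients' contributions in this sketch, an additively homomorphic scheme is in principle sufficient for the aggregation step, which is what keeps the exchanged information unreadable to the server.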
Keywords
Federated learning, Neural networks, Homomorphic encryption, Privacy-preserving, Edge computing