
On the Differential Privacy in Federated Learning Based on Over-the-Air Computation.

ACM Trans. Web (2024)

Abstract
Federated learning is a promising machine learning technique for enabling advanced services and applications in future industries. It has so far been regarded as preserving the privacy of its participants well. Recently, however, various attacks have appeared that can extract participants' private information from federated learning systems. Consequently, the development of privacy-preserving schemes for federated learning is paramount. In this paper, we consider an over-the-air computation based federated learning system and adopt the concept of differential privacy to prevent private information leakage. During training, when the sum of the local gradients is received via over-the-air computation, the gradients conceal one another and appear random to the parameter server. Motivated by this fact, we analyze the differential privacy of over-the-air computation based federated learning by accounting for the inherent randomness of the local gradients. We analytically quantify the amount of artificial noise that must be added to preserve privacy. Furthermore, we propose a parameter-estimation-based algorithm that is applicable in real scenarios. Simulation results show the efficacy of the proposed algorithm in preserving privacy.
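The core idea described in the abstract can be illustrated with a minimal sketch: the parameter server never sees individual gradients, only their analog superposition, to which artificial Gaussian noise is added for differential privacy. This is a hypothetical simplification for illustration, not the paper's exact mechanism; the function name, dimensions, and noise level are assumptions.

```python
import random

random.seed(0)

def over_the_air_aggregate(local_grads, noise_std):
    """Simulate over-the-air aggregation: the server observes only the
    sum of all clients' local gradients, plus artificial Gaussian noise
    added to preserve differential privacy (illustrative sketch only)."""
    dim = len(local_grads[0])
    # Superposition over the wireless channel: per-coordinate sum.
    superposed = [sum(g[i] for g in local_grads) for i in range(dim)]
    # Artificial noise; the paper analytically quantifies how much is needed.
    return [s + random.gauss(0.0, noise_std) for s in superposed]

# Three clients, each holding a 4-dimensional local gradient.
clients = [[random.gauss(0.0, 1.0) for _ in range(4)] for _ in range(3)]
noisy_sum = over_the_air_aggregate(clients, noise_std=0.1)
print(len(noisy_sum))  # 4
```

In the paper's setting, the inherent randomness of the superposed gradients themselves contributes to privacy, so less artificial noise is required than if each gradient were observed individually.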
Key words
Federated learning, over-the-air computation, differential privacy, central limit theorem