Dual Adversarial Federated Learning on Non-IID Data

KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, KSEM 2022, PT III(2022)

Abstract
Federated Learning (FL) enables multiple distributed local clients to coordinate with a central server to train a global model without sharing their private data. However, the data owned by different clients, even when sharing the same labels, may induce conflicts in the latent feature maps, especially in non-IID FL scenarios. This can severely impair the performance of the global model. To this end, we propose a novel approach, DAFL (Dual Adversarial Federated Learning), to mitigate the divergence in latent feature maps among different clients on non-IID data. In particular, a local dual adversarial training scheme is designed to identify the origins of latent feature maps and then transform the conflicting latent feature maps to reach a consensus between the global and local models in each client. In addition, the latent feature maps of the two models are adaptively drawn closer to each other by reducing their Kullback-Leibler divergence. Extensive experiments on benchmark datasets validate the effectiveness of DAFL and demonstrate that it outperforms state-of-the-art approaches in terms of test accuracy under different non-IID settings.
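The KL-divergence alignment step described in the abstract can be illustrated with a minimal sketch. Note that the function name `kl_alignment_loss`, the use of softmax-normalized feature maps, and the smoothing constant `eps` are assumptions for illustration; the paper's actual formulation may differ.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax along the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def kl_alignment_loss(local_feat, global_feat, eps=1e-8):
    """KL(P_local || P_global) between softmax-normalized latent
    feature maps of the local and global models (hypothetical sketch).
    A smaller value means the two feature distributions agree more."""
    p = softmax(np.asarray(local_feat, dtype=float))
    q = softmax(np.asarray(global_feat, dtype=float))
    # Element-wise KL divergence, averaged over samples.
    return float(np.sum(p * np.log((p + eps) / (q + eps)), axis=-1).mean())
```

When the local and global feature maps are identical, the loss is (numerically) zero; as they diverge, the loss grows, which is the quantity a client would minimize to pull its local features toward the global model's.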
Keywords
Federated learning, Non-IID data, Latent feature map, Dual adversarial training, Kullback-Leibler divergence