Fake It Till Make It: Federated Learning with Consensus-Oriented Generation
CoRR (2023)
Abstract
In federated learning (FL), data heterogeneity is one key bottleneck that
causes model divergence and limits performance. Addressing this, existing
methods often regard data heterogeneity as an inherent property and propose to
mitigate its adverse effects by correcting models. In this paper, we seek to
break this inherent property by generating data to complement the original
dataset and thereby fundamentally reduce the level of heterogeneity. As a novel attempt from
the perspective of data, we propose federated learning with consensus-oriented
generation (FedCOG). FedCOG consists of two key components at the client side:
complementary data generation, which generates data extracted from the shared
global model to complement the original dataset, and
knowledge-distillation-based model training, which distills knowledge from the
global model to the local model based on the generated data to mitigate
over-fitting to the original heterogeneous dataset. FedCOG has two critical
advantages: 1) it can be a plug-and-play module to further improve the
performance of most existing FL methods, and 2) it is naturally compatible with
standard FL protocols such as Secure Aggregation since it makes no modification
to the communication process. Extensive experiments on classical and real-world FL
datasets show that FedCOG consistently outperforms state-of-the-art methods.
Keywords
Federated learning, data heterogeneity
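
The abstract describes two client-side steps: generating complementary data from the shared global model, and training the local model with knowledge distillation on that generated data. The sketch below illustrates one plausible reading of those two steps in PyTorch. It is based only on the abstract, not on the authors' code: the function names, the noise-inversion heuristic used for generation, the temperature-scaled KL distillation term, and all hyperparameters (steps, lr, kd_weight, temperature) are illustrative assumptions.

# Hedged sketch of FedCOG's two client-side steps, inferred from the abstract.
# All names, losses, and hyperparameters are assumptions, not the reference code.
import torch
import torch.nn.functional as F


def generate_complementary_data(global_model, num_samples, num_classes,
                                input_shape, steps=100, lr=0.1, device="cpu"):
    # Assumption: "consensus-oriented generation" is approximated by optimizing
    # random noise so the frozen global model classifies it confidently into
    # evenly spread target classes (a model-inversion-style heuristic).
    global_model.eval()
    targets = torch.arange(num_samples, device=device) % num_classes
    synth = torch.randn(num_samples, *input_shape, device=device,
                        requires_grad=True)
    opt = torch.optim.Adam([synth], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = F.cross_entropy(global_model(synth), targets)
        loss.backward()
        opt.step()
    return synth.detach(), targets


def local_train_with_kd(local_model, global_model, real_loader, synth_data,
                        kd_weight=1.0, epochs=1, lr=0.01, temperature=2.0,
                        device="cpu"):
    # Cross-entropy on the client's own (heterogeneous) data, plus a
    # distillation term that pulls the local model toward the global model's
    # predictions on the generated data, to curb over-fitting.
    global_model.eval()
    local_model.train()
    opt = torch.optim.SGD(local_model.parameters(), lr=lr)
    synth_data = synth_data.to(device)
    for _ in range(epochs):
        for x, y in real_loader:
            x, y = x.to(device), y.to(device)
            opt.zero_grad()
            ce = F.cross_entropy(local_model(x), y)
            with torch.no_grad():
                teacher = F.softmax(global_model(synth_data) / temperature, dim=1)
            student = F.log_softmax(local_model(synth_data) / temperature, dim=1)
            kd = F.kl_div(student, teacher, reduction="batchmean") * temperature ** 2
            (ce + kd_weight * kd).backward()
            opt.step()
    return local_model

Because both steps run entirely on the client and the uploaded model update is unchanged, this kind of scheme leaves the communication protocol untouched, which is consistent with the abstract's claims of compatibility with Secure Aggregation and of being a plug-and-play addition to existing FL methods.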