FedPD: A Federated Learning Framework With Adaptivity to Non-IID Data

IEEE TRANSACTIONS ON SIGNAL PROCESSING (2021)

Cited by 233 | Views 228
Abstract
Federated Learning (FL) is popular for communication-efficient learning from distributed data. To utilize data at different clients without moving them to the cloud, algorithms such as Federated Averaging (FedAvg) have adopted a computation-then-aggregation model, in which multiple local updates are performed using local data before aggregation. These algorithms can fail when faced with practical challenges, e.g., local data that is not independent and identically distributed (non-IID). In this paper, we first characterize the behavior of the FedAvg algorithm and show that, without strong and unrealistic assumptions on the problem structure, it can behave erratically. Aiming to design FL algorithms that are provably fast and require as few assumptions as possible, we propose a new algorithm design strategy from the primal-dual optimization perspective. Our strategy yields algorithms that can deal with non-convex objective functions, achieve the best possible optimization and communication complexity (in a well-defined sense), and accommodate full-batch and mini-batch local computation models. Importantly, the proposed algorithms are communication efficient, in that the communication effort can be reduced as the level of heterogeneity among the local data decreases. In the extreme case where the local data become homogeneous, only $\mathcal{O}(1)$ rounds of communication are required among the agents. To the best of our knowledge, this is the first algorithmic framework for FL that achieves all the above properties.
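To make the primal-dual perspective concrete, the sketch below shows one communication round of an ADMM-style update of the kind the abstract describes: each client keeps a primal iterate and a dual variable, approximately minimizes an augmented local objective anchored at the current global model, takes a dual ascent step, and the server averages the clients' reports. This is a minimal illustrative sketch, not the paper's exact FedPD algorithm; the `QuadClient` class, its `grad` method, and all step sizes are hypothetical choices made here for illustration.

```python
import numpy as np

class QuadClient:
    """Toy client with local loss f_i(x) = 0.5 * ||A x - b||^2 (hypothetical)."""
    def __init__(self, A, b):
        self.A, self.b = A, b
        self.lam = np.zeros(A.shape[1])  # dual variable, initialized to zero

    def grad(self, x):
        # exact gradient of the toy quadratic local loss
        return self.A.T @ (self.A @ x - self.b)

def primal_dual_round(clients, x0, eta=0.1, local_steps=20, lr=0.01):
    """One round of a FedPD-style primal-dual update (illustrative sketch).

    Each client approximately solves the augmented local problem
        min_x f_i(x) + <lam_i, x - x0> + (1 / (2 * eta)) * ||x - x0||^2
    with a few gradient steps, then takes a dual ascent step on lam_i.
    """
    reports = []
    for c in clients:
        x = x0.copy()
        for _ in range(local_steps):
            # gradient of the augmented local objective
            g = c.grad(x) + c.lam + (x - x0) / eta
            x -= lr * g
        c.lam += (x - x0) / eta          # dual ascent step
        reports.append(x + eta * c.lam)  # quantity the server aggregates
    return np.mean(reports, axis=0)      # next global model x0
```

As a usage sketch, one could build a few `QuadClient` objects from random `A`, `b` and iterate `x0 = primal_dual_round(clients, x0)`; when the local losses are identical (homogeneous data), the local subproblems agree and very few rounds are needed, mirroring the $\mathcal{O}(1)$ communication claim above.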
Keywords
Signal processing algorithms, Protocols, Complexity theory, Servers, Computational modeling, Data models, Distributed databases, Distributed algorithms, machine learning algorithms, federated learning, data heterogeneity, convergence analysis