FedGAMMA: Federated Learning with Global Sharpness-Aware Minimization

IEEE Transactions on Neural Networks and Learning Systems (2024)

Abstract
Federated learning (FL) is a promising framework for privacy-preserving, distributed training with decentralized clients. However, there is a large divergence between the collected local updates and the expected global update, known as client drift, which is mainly caused by heterogeneous data distributions across clients, multiple local training steps, and partial client participation. Most existing works tackle this challenge under the empirical risk minimization (ERM) rule, while less attention has been paid to the relationship between the global loss landscape and generalization ability. In this work, we propose FedGAMMA, a novel FL algorithm with Global sharpness-Aware MiniMizAtion that seeks a globally flat loss landscape with high performance. Specifically, in contrast to FedSAM, which only seeks local flatness and still suffers performance degradation under client drift, we adopt a local variance control technique to better align each client's local updates, alleviating client drift and steering all clients toward the global flat region together. Finally, extensive experiments demonstrate that FedGAMMA substantially outperforms several existing FL baselines on various datasets, addressing the client-drift issue while simultaneously seeking a smoother, flatter global landscape.
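The abstract only sketches the mechanism, so the following PyTorch snippet is a minimal illustrative sketch rather than the authors' algorithm: it pairs a standard SAM ascent/descent step with a SCAFFOLD-style control-variate correction, one plausible reading of "local variance control." The function name fedgamma_local_step, the control variates c_local/c_global, and all hyperparameters are hypothetical.

```python
import torch


def fedgamma_local_step(model, loss_fn, batch, rho, lr, c_local, c_global):
    """One hypothetical local step: SAM ascent/descent plus a
    SCAFFOLD-style control-variate correction against client drift."""
    x, y = batch
    params = [p for p in model.parameters() if p.requires_grad]

    # Ascent: perturb weights along the normalized gradient to probe sharpness.
    loss = loss_fn(model(x), y)
    grads = torch.autograd.grad(loss, params)
    grad_norm = torch.sqrt(sum(g.pow(2).sum() for g in grads)) + 1e-12
    eps = [rho * g / grad_norm for g in grads]
    with torch.no_grad():
        for p, e in zip(params, eps):
            p.add_(e)

    # Descent: the gradient at the perturbed point is the sharpness-aware gradient.
    loss_perturbed = loss_fn(model(x), y)
    grads_sam = torch.autograd.grad(loss_perturbed, params)

    # Undo the perturbation, then step with the drift correction (c_global - c_local),
    # which nudges each client's update toward the expected global direction.
    with torch.no_grad():
        for p, e, g, cl, cg in zip(params, eps, grads_sam, c_local, c_global):
            p.sub_(e)
            p.sub_(lr * (g - cl + cg))


# Toy usage (hypothetical setup): zero control variates reduce to plain SAM.
model = torch.nn.Linear(10, 2)
loss_fn = torch.nn.CrossEntropyLoss()
c_local = [torch.zeros_like(p) for p in model.parameters()]
c_global = [torch.zeros_like(p) for p in model.parameters()]
batch = (torch.randn(8, 10), torch.randint(0, 2, (8,)))
fedgamma_local_step(model, loss_fn, batch, rho=0.05, lr=0.1,
                    c_local=c_local, c_global=c_global)
```

In this SCAFFOLD-style reading, the server would maintain c_global and each client c_local so that the correction term cancels the bias introduced by heterogeneous local data; how FedGAMMA actually maintains these quantities is specified in the paper, not here.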
Keywords
Client-drift, deep learning, distributed learning, federated learning (FL)