FLUTE: A Scalable, Extensible Framework for High-Performance Federated Learning Simulations

Mirian Hipolito Garcia, Andre Manoel, Daniel Madrigal Diaz, Fatemehsadat Mireshghallah, Robert Sim, Dimitrios Dimitriadis

arXiv (2022)

Abstract
In this paper we introduce "Federated Learning Utilities and Tools for Experimentation" (FLUTE), a high-performance open-source platform for federated learning research and offline simulations. The goal of FLUTE is to enable rapid prototyping and simulation of new federated learning algorithms at scale, including novel optimization, privacy, and communications strategies. We describe the architecture of FLUTE, which allows arbitrary federated modeling schemes to be realized. We compare the platform with other state-of-the-art platforms and describe the features FLUTE offers for experimentation in core areas of active research, such as optimization, privacy, and scalability. A comparison with other established platforms shows speed-ups of up to 42x and a 3x reduction in memory footprint. We also present a sample of the platform's capabilities across a range of tasks, along with other functionality such as linear scaling in the number of participating clients and a variety of federated optimizers, including FedAdam and DGA.
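To illustrate the style of server-side federated optimizer the abstract refers to, below is a minimal sketch of a FedAdam update (Reddi et al., 2021) applied to a flat parameter vector. This is not FLUTE's actual API; the function and state names (fedadam_update, the state dict) are hypothetical and chosen purely for illustration.

    # Illustrative sketch, not FLUTE's API: one round of server-side FedAdam.
    # Clients send parameter deltas; the server keeps Adam-style moment estimates.
    import numpy as np

    def fedadam_update(global_params, client_deltas, state,
                       lr=1e-2, beta1=0.9, beta2=0.99, tau=1e-3):
        """Apply one server-side FedAdam round.

        client_deltas: list of (local_params - global_params) vectors
        state: dict carrying first/second moment estimates across rounds
        """
        delta = np.mean(client_deltas, axis=0)             # aggregate client updates
        m = beta1 * state["m"] + (1 - beta1) * delta        # first moment
        v = beta2 * state["v"] + (1 - beta2) * delta ** 2   # second moment
        state["m"], state["v"] = m, v
        return global_params + lr * m / (np.sqrt(v) + tau)

    # Usage: two simulated clients each contribute a delta for one round.
    dim = 4
    params = np.zeros(dim)
    state = {"m": np.zeros(dim), "v": np.zeros(dim)}
    deltas = [np.random.randn(dim) * 0.1 for _ in range(2)]
    params = fedadam_update(params, deltas, state)

In a simulation framework such as FLUTE, an update of this kind would run on the server after each round of client training, with the aggregation and optimizer step decoupled so that alternative federated optimizers can be swapped in.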
Keywords
simulations, extensible framework, learning, high-performance