pfl-research: simulation framework for accelerating research in Private Federated Learning
CoRR (2024)
Abstract
Federated learning (FL) is an emerging machine learning (ML) training
paradigm where clients own their data and collaborate to train a global model,
without revealing any data to the server and other participants. Researchers
commonly perform experiments in a simulation environment to quickly iterate on
ideas. However, existing open-source tools do not offer the efficiency required
to simulate FL on larger and more realistic FL datasets. We introduce
pfl-research, a fast, modular, and easy-to-use Python framework for simulating
FL. It supports TensorFlow, PyTorch, and non-neural network models, and is
tightly integrated with state-of-the-art privacy algorithms. We study the speed
of open-source FL frameworks and show that pfl-research is 7-72× faster
than alternative open-source frameworks on common cross-device setups. Such
speedup will significantly boost the productivity of the FL research community
and enable testing hypotheses on realistic FL datasets that were previously too
resource intensive. We release a suite of benchmarks that evaluates an
algorithm's overall performance on a diverse set of realistic scenarios. The
code is available on GitHub at https://github.com/apple/pfl-research.
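To make the training paradigm described above concrete, the following is a minimal sketch of federated averaging (FedAvg), the canonical FL algorithm: each client runs local SGD on its private data, and the server only ever sees and averages model weights, never raw data. This is a generic illustration in NumPy, not the pfl-research API; all function names here are hypothetical.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training: a few gradient steps on
    least-squares linear regression over its private (X, y)."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # MSE gradient
        w -= lr * grad
    return w

def fed_avg(global_w, client_data, rounds=10):
    """Server loop: broadcast weights, collect locally trained
    weights, and average them. Raw data never leaves a client."""
    w = global_w
    for _ in range(rounds):
        local_ws = [local_update(w, X, y) for X, y in client_data]
        w = np.mean(local_ws, axis=0)
    return w

# Two clients whose data comes from the same model y = 2x.
rng = np.random.default_rng(0)
clients = []
for _ in range(2):
    X = rng.normal(size=(50, 1))
    clients.append((X, X @ np.array([2.0])))

w = fed_avg(np.zeros(1), clients)
```

In a real deployment (and in pfl-research simulations) the averaging would be weighted by client dataset size, clients would be subsampled per round, and privacy mechanisms such as secure aggregation or differentially private noise would be layered on top of this basic loop.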