Secure Federated Learning Across Heterogeneous Cloud and High-Performance Computing Resources – A Case Study on Federated Fine-tuning of LLaMA 2
Computing in Science & Engineering (2024)
Abstract
Federated learning enables multiple data owners to collaboratively train
robust machine learning models without transferring large or sensitive local
datasets by only sharing the parameters of the locally trained models. In this
paper, we elaborate on the design of our Advanced Privacy-Preserving Federated
Learning (APPFL) framework, which streamlines end-to-end secure and reliable
federated learning experiments across cloud computing facilities and
high-performance computing resources by leveraging Globus Compute (a
distributed function-as-a-service platform) and Amazon Web Services. We further
demonstrate a use case of APPFL: fine-tuning a LLaMA 2 7B model across
several cloud resources and supercomputers.
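The core mechanism the abstract describes, sharing only locally trained model parameters rather than raw data, is commonly realized via federated averaging. The sketch below illustrates that aggregation step in plain Python; the function name and data layout are hypothetical and are not taken from the APPFL API.

```python
# Illustrative federated-averaging sketch: each data owner trains locally
# and ships only its parameters; the server computes a weighted average.
# This is a minimal assumption-based example, not APPFL's actual interface.

def fedavg(client_params, client_sizes):
    """Weighted average of per-client parameter dicts.

    client_params: list of dicts mapping layer name -> list of floats
    client_sizes:  list of local dataset sizes, used as aggregation weights
    """
    total = sum(client_sizes)
    global_params = {}
    for key in client_params[0]:
        dim = len(client_params[0][key])
        global_params[key] = [
            sum(p[key][i] * n for p, n in zip(client_params, client_sizes)) / total
            for i in range(dim)
        ]
    return global_params
```

For example, two clients holding parameters `[1.0, 2.0]` and `[3.0, 4.0]` with dataset sizes 1 and 3 would yield the weighted average `[2.5, 3.5]`, with no client ever transmitting its local data.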