Demonstration of Federated Learning in a Resource-Constrained Networked Environment

2019 IEEE International Conference on Smart Computing (SMARTCOMP)

Abstract
Many modern applications in the area of smart computing are based on machine learning techniques. Training machine learning models usually requires a large amount of data, which is often not readily available at a central location. Federated learning enables the training of machine learning models from distributed datasets at client devices without transmitting the data to a central place, which preserves the privacy of user data and reduces communication bandwidth. In this demonstration, we show a federated learning system deployed in a wide-area communications network with dynamic, heterogeneous, and intermittent resource availability, emulated using the CORE/EMANE emulator. In our system, the environment is decentralized and each client can request assistance from other clients. Client availability is intermittent, so only clients that are currently available can provide assistance. A graphical interface illustrates the network connections and allows the user to adjust them, and a user interface displays the training progress and each client's contribution to training.
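For readers unfamiliar with the aggregation step underlying federated learning, the following is a minimal sketch of federated averaging (FedAvg) on synthetic data. It is not the demonstrated system, which runs over an emulated CORE/EMANE network with real clients; the model, data, and availability simulation below are illustrative assumptions only.

```python
# Minimal illustrative sketch of federated averaging (FedAvg).
# All data, model choices, and names are hypothetical, not the authors' code.
import numpy as np

rng = np.random.default_rng(0)

def local_update(w, X, y, lr=0.1, epochs=5):
    """One client's local training: a few epochs of gradient descent
    on a linear least-squares model, using only that client's data."""
    for _ in range(epochs):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

# Synthetic, non-identically distributed client datasets (assumption).
true_w = np.array([2.0, -1.0])
clients = []
for shift in (-1.0, 0.0, 1.0, 2.0):
    X = rng.normal(loc=shift, size=(50, 2))
    y = X @ true_w + 0.1 * rng.normal(size=50)
    clients.append((X, y))

w_global = np.zeros(2)
for rnd in range(20):
    # Only "available" clients participate in a round, mimicking the
    # intermittent availability described in the abstract (chosen at random here).
    available = [c for c in clients if rng.random() > 0.3] or clients[:1]
    updates = [local_update(w_global.copy(), X, y) for X, y in available]
    sizes = np.array([len(y) for _, y in available], dtype=float)
    # Aggregate local models weighted by local dataset size; raw data never
    # leaves the clients, only model parameters are exchanged.
    w_global = np.average(updates, axis=0, weights=sizes)

print("estimated weights:", w_global)  # should approach true_w
```

The key design point reflected here is that clients exchange only model parameters, never raw data, and the aggregator weights each contribution by the size of the client's local dataset.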
Keywords
Distributed machine learning, federated learning, model training, networking