Resource-Efficient Federated Learning With Non-IID Data: An Auction Theoretic Approach

IEEE Internet of Things Journal (2022)

Cited by 7 | Viewed 31

Abstract
Federated learning (FL) has gained significant importance for intelligent applications, driven by the massive volumes of data produced by numerous distributed IoT devices. From an FL perspective, the key challenge is that this data is not independently and identically distributed (IID) across different data sources and locations. This distribution skewness leads to significant model-quality degradation. Moreover, an intrinsic consequence of using such non-IID data in decentralized learning is increased cost compared with training on IID data. As a remedy, we propose a resource-efficient method for training an FL-based application with non-IID data, minimizing cost through an auction approach and mitigating quality degradation through data sharing. In an experimental evaluation, we investigate FL performance using real-world non-IID data and use the resulting ground-truth outputs to develop functions for estimating the utility of non-IID data, computational resource costs, and data generation costs. These functions are used to optimize the cost of model training, ensuring resource efficiency. We further demonstrate that using shared IID data significantly increases the resource efficiency of FL with local non-IID data, even when the shared IID data amounts to less than 1% of the local non-IID data. Moreover, this work demonstrates that the profitability of the stakeholders can be maximized using the proposed auction procedure. The integration of the auction procedure and a resource-efficient training strategy allows FL service providers to create practical trading strategies by minimizing the FL clients’ resources and payments in a machine learning marketplace.
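The data-sharing idea in the abstract can be illustrated with a minimal sketch: a client holds label-skewed (non-IID) local data, and a small server-shared IID set, under 1% of the local size, restores coverage of the full label space. All names, sizes, and the coverage metric below are illustrative assumptions, not the paper's actual setup.

```python
import random

random.seed(0)

NUM_CLASSES = 10  # assumed label space for this toy example

# Local non-IID data: 10,000 samples drawn from only 2 of the 10 classes.
local_labels = [random.choice([3, 7]) for _ in range(10_000)]

# Shared IID data: 90 samples (<1% of local size), uniform over all classes.
shared_labels = [random.randrange(NUM_CLASSES) for _ in range(90)]

combined = local_labels + shared_labels

def coverage(labels):
    """Fraction of the label space observed in a dataset."""
    return len(set(labels)) / NUM_CLASSES

print(f"local coverage:    {coverage(local_labels):.1f}")   # 0.2
print(f"combined coverage: {coverage(combined):.1f}")
print(f"shared fraction:   {len(shared_labels) / len(local_labels):.2%}")
```

Even this tiny uniform sample makes every class visible to the client, which is the intuition behind the paper's finding that shared IID data smaller than 1% of the local data can substantially improve FL resource efficiency.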
Keywords
Auction theory, federated learning (FL), non-identically and independently distributed (non-IID) data distribution, resource efficiency