Federated Learning over Time-Varying Channels

2021 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM)(2021)

Cited 3 | Viewed 18
Abstract
We study distributed machine learning (ML) systems in which independent workers compute local gradients from their local datasets and send them to a parameter server (PS) over a time-varying multipath fading multiple access channel (MAC) via orthogonal frequency-division multiplexing (OFDM). We assume that the workers have no channel state information (CSI), so the PS employs multiple antennas to remove the fading effects. Time variations in the wireless channel cause inter-carrier interference (ICI), which degrades the performance of OFDM systems, especially when the channel varies rapidly. To examine the effects of channel variations on federated learning systems, we analyze the interference in the aggregate gradient term at the PS caused by Doppler, and show that its undesired effects are limited. Specifically, the ICI term becomes insignificant for slow to moderate time variations. We also validate our theoretical expectations via simulations and demonstrate that the destructive effect of ICI can be alleviated for moderate levels of channel variation.
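The setting described above can be sketched numerically. The toy model below is an illustrative assumption, not the paper's simulation setup: each worker's gradient occupies one OFDM subcarrier, the multi-antenna PS is assumed to have already removed the fading, and Doppler-induced ICI is modeled crudely as a small leakage (`eps`) from adjacent subcarriers. All parameter values (`K`, `N`, `eps`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

K = 10     # number of workers (hypothetical)
N = 64     # OFDM subcarriers, one gradient entry per subcarrier (hypothetical)
eps = 0.02 # ICI leakage coefficient, loosely standing in for normalized Doppler

# Each worker holds a local gradient; the ideal over-the-air aggregate is
# their sum, which the MAC computes "for free" by signal superposition.
grads = rng.standard_normal((K, N))
ideal = grads.sum(axis=0)

def aggregate_with_ici(grads, eps):
    """Aggregate gradients over the MAC with a crude ICI perturbation.

    Fading is assumed removed by the multi-antenna PS, so the only
    impairment kept here is leakage from the two neighboring subcarriers.
    """
    agg = grads.sum(axis=0)
    ici = eps * (np.roll(agg, 1) + np.roll(agg, -1))
    return agg + ici

received = aggregate_with_ici(grads, eps)
rel_err = np.linalg.norm(received - ideal) / np.linalg.norm(ideal)
# For small eps (slow to moderate time variation) the relative ICI
# distortion of the aggregate gradient stays small, consistent with
# the abstract's claim that the ICI term becomes insignificant.
print(f"relative ICI distortion: {rel_err:.4f}")
```

Sweeping `eps` upward in this sketch mimics faster channel variation and shows the distortion growing roughly linearly, which is the regime boundary the paper's analysis characterizes.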
Keywords
Federated learning, distributed machine learning, stochastic gradient descent, time-varying multipath fading MAC, OFDM, Doppler spread