Novel Approach towards a Fully Deep Learning-Based IoT Receiver Architecture: From Estimation to Decoding

Future Internet (2024)

Abstract
As the Internet of Things (IoT) continues to expand, wireless communication is increasingly widespread across diverse industries and remote devices, including domains such as Operational Technology in the Smart Grid. Notably, there is a surge in resource-constrained devices leveraging wireless communication, especially with the advances in 5G/6G technology. Nevertheless, wireless transmission demands substantial power and computational resources, presenting a significant challenge to these devices and their operation. In this work, we propose the use of deep learning to improve the Bit Error Rate (BER) performance of Orthogonal Frequency Division Multiplexing (OFDM) wireless receivers. By improving the BER performance of these receivers, devices can transmit with less power, thereby extending IoT devices' battery life. The architecture presented in this paper uses a depthwise Convolutional Neural Network (CNN) for channel estimation and demodulation, and a Graph Neural Network (GNN) for Low-Density Parity Check (LDPC) decoding, tested against a proposed (1998, 1512) LDPC code. Our results show higher performance than traditional receivers in isolated tests of the CNN and the GNN as well as in a combined end-to-end test, with lower computational complexity than other proposed deep learning models. In terms of BER, the proposed approach eliminates bit errors for QPSK at an SNR 1 dB lower than a traditional receiver. It also improved 16-QAM Rician BER by five decades (orders of magnitude), 16-QAM line-of-sight (LOS) BER by four decades, 64-QAM Rician BER by 2.5 decades, and 64-QAM LOS BER by three decades.
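To make the channel-estimation stage concrete, below is a minimal PyTorch sketch of a depthwise-separable CNN that refines a coarse pilot-based channel estimate over an OFDM resource grid, in the spirit of the architecture described above. The layer sizes, tensor layout, and the names ChannelEstimatorCNN and DepthwiseSeparableBlock are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch (not the authors' released code): a depthwise-separable
# CNN that maps a coarse (e.g., least-squares) channel estimate of an OFDM
# resource grid to a refined estimate. Shapes and layer sizes are assumptions.
import torch
import torch.nn as nn

class DepthwiseSeparableBlock(nn.Module):
    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        # Depthwise convolution: one filter per input channel (groups=channels)
        self.depthwise = nn.Conv2d(channels, channels, kernel_size,
                                   padding=kernel_size // 2, groups=channels)
        # Pointwise 1x1 convolution mixes information across channels
        self.pointwise = nn.Conv2d(channels, channels, kernel_size=1)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.pointwise(self.depthwise(x)))

class ChannelEstimatorCNN(nn.Module):
    """Refines a coarse channel estimate.

    Input/output shape: (batch, 2, subcarriers, symbols), with the real and
    imaginary parts of the channel stacked as two feature channels.
    """
    def __init__(self, hidden: int = 16, blocks: int = 3):
        super().__init__()
        layers = [nn.Conv2d(2, hidden, 3, padding=1), nn.ReLU()]
        layers += [DepthwiseSeparableBlock(hidden) for _ in range(blocks)]
        layers += [nn.Conv2d(hidden, 2, 3, padding=1)]
        self.net = nn.Sequential(*layers)

    def forward(self, h_coarse: torch.Tensor) -> torch.Tensor:
        # Residual connection: the network learns a correction to the
        # coarse estimate rather than the full channel response.
        return h_coarse + self.net(h_coarse)

if __name__ == "__main__":
    # Toy resource grid: 64 subcarriers x 14 OFDM symbols
    coarse = torch.randn(1, 2, 64, 14)
    refined = ChannelEstimatorCNN()(coarse)
    print(refined.shape)  # torch.Size([1, 2, 64, 14])

Depthwise-separable convolutions split each standard convolution into a per-channel spatial filter followed by a 1x1 channel mixer, which cuts parameter count and multiply-accumulate operations substantially; this is consistent with the paper's stated goal of lower computational complexity than other deep learning receivers.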
Keywords
IoT, 5G, operational technology, OFDM, receiver, deep learning