
An Efficient Parallel Self-Attention Transformer for CSI Feedback

Ziang Liu, Tianyu Song, Ruohan Zhao, Jiyu Jin, Guiyue Jin, Lei Fan

Physical Communication (2024)

Abstract
In massive multiple-input multiple-output (MIMO) systems, user equipment (UE) must feed downlink channel state information (CSI) back to the base station (BS). As the number of antennas grows, this CSI feedback consumes a significant share of the uplink bandwidth. To reduce the overhead, we propose the EPAformer, an efficient parallel-attention transformer: a lightweight network that applies the transformer architecture and efficient parallel self-attention (EPSA) to the CSI feedback task. EPSA effectively expands the attention area of each token within a transformer block by dividing the heads into parallel groups and conducting self-attention within horizontal and vertical stripes, which yields better feature compression and reconstruction. Simulation results show that the EPAformer surpasses previous deep-learning-based approaches in both reconstruction performance and complexity.
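The stripe-based head grouping described in the abstract resembles cross-shaped window attention: half of the heads attend within horizontal stripes of the 2D channel map while the other half attend within vertical stripes, in parallel. The PyTorch sketch below illustrates that idea under stated assumptions; the class name ParallelStripeAttention, the equal half/half head split, the stripe-width parameter, and all tensor layouts are illustrative choices, not the paper's actual EPSA implementation.

```python
import torch
import torch.nn as nn


class ParallelStripeAttention(nn.Module):
    """Illustrative stripe-grouped multi-head self-attention (assumed
    design, not the paper's exact EPSA layer).

    Half of the heads attend within horizontal stripes (height `stripe`,
    full width) and the other half within vertical stripes (width
    `stripe`, full height); the two groups run in parallel and their
    outputs are concatenated along the head dimension.
    """

    def __init__(self, dim: int, num_heads: int = 4, stripe: int = 2):
        super().__init__()
        assert dim % num_heads == 0 and num_heads % 2 == 0
        self.heads = num_heads
        self.hdim = dim // num_heads
        self.stripe = stripe
        self.qkv = nn.Linear(dim, dim * 3, bias=False)
        self.proj = nn.Linear(dim, dim)

    def _attend(self, qkv, B, H, W, horizontal):
        # qkv: (3, B, group_heads, H, W, hdim)
        s = self.stripe
        if horizontal:
            # Group rows into H//s stripes of s*W tokens each.
            qkv = qkv.reshape(3, B, -1, H // s, s * W, self.hdim)
        else:
            # Swap H and W, then group columns into W//s stripes.
            qkv = qkv.transpose(3, 4).reshape(3, B, -1, W // s, s * H, self.hdim)
        q, k, v = qkv[0], qkv[1], qkv[2]
        # Attention is computed per stripe, never over the full H*W map.
        attn = (q @ k.transpose(-2, -1)) * self.hdim ** -0.5
        out = attn.softmax(dim=-1) @ v  # (B, group_heads, stripes, tokens, hdim)
        if horizontal:
            return out.reshape(B, -1, H, W, self.hdim)
        return out.reshape(B, -1, W, H, self.hdim).transpose(2, 3)

    def forward(self, x, H, W):
        # x: (B, H*W, C) tokens from an H x W channel map
        # (e.g. an angular-delay-domain CSI matrix).
        B, N, C = x.shape
        assert N == H * W and H % self.stripe == 0 and W % self.stripe == 0
        qkv = (self.qkv(x)
               .reshape(B, H, W, 3, self.heads, self.hdim)
               .permute(3, 0, 4, 1, 2, 5))  # (3, B, heads, H, W, hdim)
        g = self.heads // 2
        out_h = self._attend(qkv[:, :, :g], B, H, W, horizontal=True)
        out_v = self._attend(qkv[:, :, g:], B, H, W, horizontal=False)
        out = torch.cat([out_h, out_v], dim=1)            # (B, heads, H, W, hdim)
        out = out.permute(0, 2, 3, 1, 4).reshape(B, N, C)
        return self.proj(out)


# Usage: a batch of 32x32 channel maps embedded into 64-dim tokens.
attn = ParallelStripeAttention(dim=64, num_heads=4, stripe=2)
x = torch.randn(8, 32 * 32, 64)
y = attn(x, H=32, W=32)  # -> (8, 1024, 64)
```

Because the two head groups are concatenated, the layer is shape-compatible with standard multi-head self-attention, but each attention matrix is computed over a stripe rather than all H*W tokens, which is consistent with the complexity reduction the abstract claims.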
Keywords
CSI feedback, Massive MIMO, Self-attention, Transformer, Deep learning