Computation Offloading for IoT in C-RAN: Optimization and Deep Learning

IEEE Transactions on Communications (2020)

Abstract
We consider computation offloading for Internet-of-Things (IoT) applications in a multiple-input multiple-output (MIMO) cloud radio access network (C-RAN). Specifically, the computational tasks of the IoT devices (IoTDs) are offloaded to a MIMO C-RAN, where a MIMO remote radio head (RRH) is connected to a baseband unit (BBU) through a capacity-limited fronthaul link, facilitated by spatial filtering and uniform scalar quantization. We formulate a computation-offloading optimization problem that minimizes the total transmit power of the IoTDs while satisfying the latency requirement of the computational tasks. To obtain a feasible solution to this non-convex problem, the spatial filtering matrix is first optimized locally at the MIMO RRH. Subsequently, the remaining variables are jointly optimized at the BBU within an alternating optimization framework: the baseband combiner, the resource allocation, and the number of quantization bits are obtained via the minimum-mean-squared-error (MMSE) criterion, the successive inner convexification method, and a line-search method, respectively. As a low-complexity alternative, we apply a supervised deep learning (DL) method that learns from the solutions produced by our proposed algorithm. In addition, deep transfer learning is adopted to adapt the neural network to dynamic IoT systems. Numerical results validate the effectiveness of the proposed optimization algorithm and the learning-based methods.
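The fronthaul model couples the quantization resolution to the link budget: coarser uniform scalar quantization lowers the fronthaul rate but raises the quantization noise, and the abstract mentions a line search over the number of quantization bits. The sketch below is purely illustrative and not taken from the paper; the dynamic range, stream count, capacity budget, and noise target are hypothetical placeholders. It shows one plausible form of such a search: pick the smallest bit-width whose uniform-quantization noise variance meets a target while the resulting fronthaul rate fits the link capacity.

```python
import numpy as np

def uniform_quantize(x, n_bits, x_max):
    """Uniform scalar quantizer on [-x_max, x_max] with 2**n_bits levels."""
    levels = 2 ** n_bits
    step = 2 * x_max / levels                        # quantization step size
    xq = np.clip(x, -x_max, x_max * (1 - 1e-12))     # keep the boundary inside the top cell
    return step * (np.floor(xq / step) + 0.5)        # mid-rise reconstruction

def line_search_bits(x_max, streams, bandwidth, fronthaul_capacity,
                     noise_target, max_bits=16):
    """Smallest bit-width whose quantization-noise variance meets the target
    while the aggregate fronthaul rate stays within the link capacity."""
    for b in range(1, max_bits + 1):
        step = 2 * x_max / (2 ** b)
        q_noise_var = step ** 2 / 12                 # uniform-quantization noise variance
        rate = streams * bandwidth * b               # bits/s pushed over the fronthaul
        if q_noise_var <= noise_target and rate <= fronthaul_capacity:
            return b, q_noise_var, rate
    return None                                      # infeasible under these assumptions

# Hypothetical numbers, for illustration only.
print(line_search_bits(x_max=1.0, streams=4, bandwidth=10e6,
                       fronthaul_capacity=400e6, noise_target=1e-3))
```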
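For the learning-based part, the abstract describes a supervised network trained on solutions generated by the optimization algorithm, with deep transfer learning used to adapt it when the IoT system changes. Below is a minimal sketch of that idea, not the authors' architecture: the feature and output dimensions, layer sizes, and the fine-tuning strategy (freezing the early layers and retraining the output layer on a small amount of new data) are assumptions for illustration.

```python
import torch
import torch.nn as nn

class OffloadNet(nn.Module):
    """Small MLP mapping system features (e.g. channel/task statistics)
    to the resource-allocation decisions found by the optimizer."""
    def __init__(self, n_features, n_outputs, hidden=128):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU())
        self.head = nn.Linear(hidden, n_outputs)

    def forward(self, x):
        return self.head(self.backbone(x))

def train(model, x, y, epochs=200, lr=1e-3):
    """Supervised regression onto the optimizer's solutions (MSE loss)."""
    opt = torch.optim.Adam(
        (p for p in model.parameters() if p.requires_grad), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    return model

# Offline: learn from (feature, solution) pairs produced by the algorithm.
x_train, y_train = torch.randn(1000, 16), torch.randn(1000, 8)  # placeholder data
model = train(OffloadNet(16, 8), x_train, y_train)

# Transfer to a changed IoT system: freeze the backbone and retrain the head
# on a small new dataset (one plausible form of deep transfer learning).
for p in model.backbone.parameters():
    p.requires_grad = False
x_new, y_new = torch.randn(100, 16), torch.randn(100, 8)        # placeholder data
model = train(model, x_new, y_new, epochs=50)
```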
Keywords
IoT, computation offloading, C-RAN, deep learning, deep transfer learning