Privacy-Aware Edge Computing Based on Adaptive DNN Partitioning

IEEE Global Communications Conference (2019)

Abstract
Recent years have seen deep neural networks (DNNs) become the de facto tool in many applications, such as image classification and speech recognition. However, performing DNN inference on resource-constrained mobile devices remains challenging. Although edge computing allows complex DNN inference tasks to be performed in close proximity to the mobile device, optimizing performance requires a carefully designed synergy between the edge and the device. Moreover, the confidentiality of data uploaded to a possibly untrusted edge server is of great concern. In this paper, we investigate the impact of DNN partitioning on inference latency and privacy risk in edge computing. Based on the obtained insights, we design an offloading strategy that adaptively partitions the DNN under varying network conditions to strike an optimal tradeoff between performance and privacy for battery-powered mobile devices. The strategy is designed under the learning-aided Lyapunov optimization framework and has a provable performance guarantee. Finally, we build a small-scale testbed to demonstrate the efficacy of the proposed offloading scheme.
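The core partitioning idea in the abstract can be sketched as a split-point search: run the first k layers on the device, upload the layer-k activation, and run the rest on the edge, scoring each split by latency plus a weighted privacy penalty. The sketch below is illustrative only; the `choose_partition` helper, all numeric values, and the privacy-risk scores are assumptions, and the paper's actual policy is a learning-aided Lyapunov (drift-plus-penalty) algorithm, which this toy exhaustive search only mimics in spirit via the weight V.

```python
def choose_partition(device_ms, edge_ms, upload_kb, privacy_risk,
                     bandwidth_kbps, V=1.0):
    """Pick split point k (layers 0..k-1 on device, rest on edge) that
    minimizes latency + V * privacy_risk[k]. V trades privacy against
    latency, loosely echoing a drift-plus-penalty weight. All inputs
    are hypothetical profiling data, not measurements from the paper."""
    n = len(device_ms)
    best_k, best_cost = 0, float("inf")
    for k in range(n + 1):  # k = n means fully local inference
        latency = (sum(device_ms[:k])                       # on-device compute
                   + upload_kb[k] / bandwidth_kbps * 1000.0  # upload time (ms)
                   + sum(edge_ms[k:]))                      # edge compute
        cost = latency + V * privacy_risk[k]
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k, best_cost

# Example with made-up per-layer profiles: deeper splits upload smaller,
# more abstract activations, so privacy risk falls as k grows.
k, cost = choose_partition(
    device_ms=[5.0, 12.0, 20.0, 8.0],
    edge_ms=[0.5, 1.2, 2.0, 0.8],
    upload_kb=[600.0, 300.0, 80.0, 20.0, 4.0],  # upload_kb[0] = raw input
    privacy_risk=[10.0, 6.0, 3.0, 1.0, 0.0],
    bandwidth_kbps=10000.0)
```

At a high bandwidth the search settles on an interior split, whereas shrinking `bandwidth_kbps` pushes the choice toward fully local execution, which is exactly the adaptivity to network conditions the abstract describes.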
Keywords
privacy-aware edge, adaptive DNN, deep neural networks, de facto tool, image classification, speech recognition, mobile device, edge computing, complex DNN inference tasks, performance optimization, carefully designed synergy, possibly untrusted edge server, DNN partitioning, inference latency performance, privacy risks, adaptive partitioning, network environments, battery-powered mobile devices, learning-aided Lyapunov optimization framework, provable performance guarantee