Using LSTM Neural Networks as Resource Utilization Predictors: The Case of Training Deep Learning Models on the Edge

GECON (2020)

Abstract
Cloud and Fog technologies are steadily gaining momentum and popularity in research and industry circles, and both communities are concerned with resource usage. The present work aims to predict the resource usage of a machine learning application in an edge environment built from Raspberry Pi devices. It investigates various experimental setups and machine learning methods that act as benchmarks, allowing us to compare the accuracy of each setup. We propose a prediction model that leverages the time-series characteristics of resource utilization by employing an LSTM Recurrent Neural Network (LSTM-RNN). To arrive at a near-optimal LSTM-RNN architecture, we use a genetic algorithm. For the experimental evaluation we used a real dataset constructed by training a well-known model on Raspberry Pi 3 devices. The results are encouraging with respect to the applicability of our method.
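To illustrate the kind of predictor the abstract describes, the following is a minimal sketch of a one-step-ahead LSTM forecaster for a univariate resource-utilization trace (e.g. CPU load sampled on an edge device while a deep learning model trains). The window size, layer width, optimizer, and the synthetic trace are illustrative assumptions; they are not the architecture selected by the paper's genetic algorithm or its actual dataset.

```python
# Hedged sketch: next-step prediction of a resource-utilization time series
# with an LSTM. All hyperparameters below are assumptions for illustration.
import numpy as np
import tensorflow as tf

WINDOW = 30  # assumed look-back window (number of past samples)

def make_windows(series, window=WINDOW):
    """Turn a 1-D utilization series into (samples, window, 1) inputs
    and next-step targets for supervised training."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    X = np.asarray(X, dtype=np.float32)[..., np.newaxis]
    return X, np.asarray(y, dtype=np.float32)

# Synthetic CPU-utilization trace in [0, 1]; a real trace would be collected
# from the Raspberry Pi while the deep learning workload runs.
trace = (np.sin(np.linspace(0, 50, 2000)) * 0.3 + 0.5
         + np.random.normal(0, 0.02, 2000)).astype(np.float32)
X, y = make_windows(trace)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, 1)),
    tf.keras.layers.LSTM(64),    # assumed hidden size
    tf.keras.layers.Dense(1),    # predicted next-step utilization
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.1, verbose=0)

# One-step-ahead forecast from the most recent window.
next_util = model.predict(trace[-WINDOW:].reshape(1, WINDOW, 1), verbose=0)
print(float(next_util))
```

In the paper, a genetic algorithm searches over choices such as the number of LSTM units and layers; the fixed values above simply stand in for one candidate configuration.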
Keywords
Resource utilization, Edge computing, Long short-term memory, Deep learning, Genetic algorithm