A Deep Reinforcement Learning-based Task Scheduling Algorithm for Energy Efficiency in Data Centers

International Conference on Computer Communications and Networks (2021)

Citations: 4 | Views: 27
No rating yet
Abstract
Cloud data centers support a wide range of end-user application scenarios, including scientific computing, smart grids, and more. The number and size of data centers have grown rapidly in recent years, causing severe environmental problems and enormous power demand. It is therefore desirable to use a proper scheduling method to optimize resource usage and reduce energy consumption in a data center. However, designing an effective and efficient task scheduling algorithm is difficult because of the dynamic and complex environment of data centers. This paper proposes a task scheduling algorithm, WSS, that optimizes resource usage and reduces energy consumption based on a model-free deep reinforcement learning framework inspired by the Wolpertinger architecture. The proposed algorithm can handle scheduling over a sizeable discrete action space, improve decision efficiency, and reduce training convergence time. In addition, WSS builds on Soft Actor-Critic to improve stability and exploration capability. Experiments on real-world traces show that WSS can reduce energy consumption by nearly 25% compared with a Deep Q-Network task scheduling algorithm. Moreover, WSS converges quickly during training without increasing the average waiting time of tasks and achieves stable performance.
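To make the Wolpertinger-style decision step referenced in the abstract more concrete, the sketch below illustrates the general idea behind that architecture: an actor proposes a continuous proto-action, the k nearest discrete actions are retrieved by embedding distance, and a critic Q-function picks the best candidate. This is a minimal illustrative sketch, not the paper's WSS implementation; the stand-in actor/critic, the embedding dimensions, and all names are assumptions for illustration only.

```python
# Illustrative sketch of Wolpertinger-style action selection (not the authors' code).
# Actor -> continuous proto-action; k-nearest discrete actions; critic picks argmax Q.
import numpy as np

rng = np.random.default_rng(0)

N_ACTIONS = 1000   # size of the discrete action space (e.g. task-to-server placements); assumed
EMBED_DIM = 8      # dimensionality of the action embedding space; assumed
K = 10             # number of nearest neighbours the critic refines over; assumed

# Fixed embedding for every discrete action (random here, purely for illustration).
action_embeddings = rng.normal(size=(N_ACTIONS, EMBED_DIM))
actor_weights = rng.normal(size=(EMBED_DIM, EMBED_DIM))     # stand-in actor parameters


def actor(state: np.ndarray) -> np.ndarray:
    """Stand-in actor: maps a state to a continuous proto-action embedding."""
    return np.tanh(state @ actor_weights)


def critic(state: np.ndarray, action_embedding: np.ndarray) -> float:
    """Stand-in critic: returns a scalar Q-value for a (state, action) pair."""
    return float(np.dot(state, action_embedding))


def wolpertinger_select(state: np.ndarray) -> int:
    """Proto-action -> k nearest discrete actions -> critic argmax."""
    proto = actor(state)                                    # continuous proposal
    dists = np.linalg.norm(action_embeddings - proto, axis=1)
    candidates = np.argsort(dists)[:K]                      # k nearest discrete actions
    q_values = [critic(state, action_embeddings[a]) for a in candidates]
    return int(candidates[int(np.argmax(q_values))])        # best candidate by Q-value


state = rng.normal(size=EMBED_DIM)
print("selected action index:", wolpertinger_select(state))
```

In a full agent the actor and critic would be trained neural networks (the paper pairs this selection scheme with Soft Actor-Critic); the retrieval step is what keeps decision-making tractable over a large discrete action space.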
Keywords
Cloud computing, energy efficiency, task scheduling, deep reinforcement learning, Wolpertinger architecture