DRL360: 360-degree Video Streaming with Deep Reinforcement Learning

IEEE International Conference on Computer Communications (2019)

Cited by 146 | Views 118

Abstract
360-degree videos have gained popularity in recent years, owing to the great advances in panoramic cameras and head-mounted devices. However, as 360-degree videos are usually in high resolution, transmitting the content requires extremely high bandwidth. To protect the Quality of Experience (QoE) of users, researchers have proposed tile-based 360-degree video streaming systems that allocate high/low bit rates to selected tiles of video frames for streaming over the limited bandwidth. Determining which tiles should receive a high or low rate is challenging because (1) video playback involves many features that change dynamically over time and must be taken into account when making the rate allocation; and (2) most state-of-the-art systems rely on a fixed set of heuristics to optimize a specific QoE objective, while users may have various QoE objectives that need to be optimized in different ways. This paper presents a Deep Reinforcement Learning (DRL) based framework for 360-degree video streaming, named DRL360. The DRL360 framework improves system performance by jointly optimizing multiple QoE objectives across a broad set of dynamic features. The DRL-based model adaptively allocates rates for the tiles of future video frames based on observations collected by client video players. We compare the proposed DRL360 to existing systems through trace-driven evaluations as well as a real-world experiment over a wide variety of network conditions. Evaluation results reveal that DRL360 adapts to all considered scenarios and outperforms state-of-the-art approaches by 20%–30% on average under different QoE objectives.
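The abstract describes a control loop in which the client observes dynamic playback features and a learned policy allocates a bitrate to every tile of the upcoming frames so as to maximize a configurable QoE objective. The sketch below is a minimal illustration of that loop, not the authors' implementation; the tile count, bitrate ladder, QoE weights, feature layout, and the placeholder heuristic policy are all assumptions made for the example.

# Illustrative sketch of a tile-rate allocation step as described in the
# abstract. All constants and the stand-in "policy" are hypothetical; a real
# DRL agent would replace TileRateAgent.act with a trained neural-network policy.
import numpy as np

N_TILES = 24                      # assumed tiling of the panoramic frame
BITRATES = [1.0, 2.5, 5.0, 8.0]   # assumed per-tile bitrate levels (Mbps)

def qoe_reward(tile_quality, rebuffer_s, quality_change,
               w_quality=1.0, w_rebuf=4.0, w_smooth=1.0):
    """Generic linear QoE objective: reward viewport quality, penalize
    rebuffering and quality oscillation. Weights are illustrative only."""
    return (w_quality * tile_quality
            - w_rebuf * rebuffer_s
            - w_smooth * quality_change)

class TileRateAgent:
    """Maps the observed playback state to a per-tile bitrate allocation."""
    def act(self, state):
        # state: recent throughput samples, buffer level, viewport prediction
        viewport_prob = state["viewport_prob"]        # shape (N_TILES,)
        # Crude heuristic stand-in: spend bits where the viewer likely looks.
        levels = np.digitize(viewport_prob, [0.1, 0.3, 0.6])
        return np.array(BITRATES)[levels]             # Mbps per tile

agent = TileRateAgent()
state = {
    "throughput_mbps": np.array([18.0, 22.5, 20.1]),  # recent samples
    "buffer_s": 6.4,
    "viewport_prob": np.random.dirichlet(np.ones(N_TILES)),
}
allocation = agent.act(state)
reward = qoe_reward(tile_quality=float(allocation @ state["viewport_prob"]),
                    rebuffer_s=0.0, quality_change=0.0)
print("per-tile bitrates (Mbps):", allocation)
print("example QoE reward:", round(reward, 2))

In a DRL setting, the reward signal above (or any other weighting of quality, rebuffering, and smoothness) would drive policy training, which is how a single framework can be tuned to different QoE objectives.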
Keywords
Streaming media, Quality of experience, Resource management, Bandwidth, Adaptation models, Reinforcement learning, Bit rate