Learning When and How to Forget in Variable Window LFU Caching

GLOBECOM Workshops (2023)

Abstract
In this paper, we introduce a novel caching approach to improve content delivery, particularly for local small-cell access points with limited computational and memory resources. We focus on improving cache hit ratios and minimizing the load on the fronthaul link with the aid of a change point detection module. Importantly, the whole process is done locally, in the context of emerging self-organizing networks, and preserves privacy. We present a modified least frequently used (LFU) caching strategy, called variable window LFU (VW-LFU), which adapts to changes in content popularity. We introduce a non-parametric change point detection method, which recognizes changes in the traffic distribution with less than a 12% delay and prompts the VW-LFU system to adjust its window size. This approach mitigates biases from previous environments, improving cache decisions and reducing the fronthaul load. We prove that in an environment with countably many stationary modes, the average regret of our algorithm can be as low as $O(1)$. Implemented in a four-room conference center with nine access points, our results show more than a 40% improvement in the average hit ratio in low-capacity regimes, demonstrating the effectiveness of our strategy. We note that VW-LFU has an efficient $O(1)$ implementation with the aid of a hash table and linked frequency lists.
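The $O(1)$ implementation mentioned above follows the standard constant-time LFU design: a hash table maps each key to its value and frequency, and keys sharing a frequency are grouped into ordered buckets so the least-frequently (and, on ties, least-recently) used item can be evicted in constant time. The sketch below illustrates that base LFU structure only; the paper's variable-window mechanism (resizing the counting window on detected change points) is not part of the abstract's detail and is omitted here. All names are illustrative, not from the paper.

```python
from collections import defaultdict, OrderedDict

class LFUCache:
    """Sketch of an O(1) LFU cache: hash tables plus per-frequency ordered buckets."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.key_to_val = {}                        # key -> cached value
        self.key_to_freq = {}                       # key -> access count
        self.freq_to_keys = defaultdict(OrderedDict)  # freq -> keys in LRU order
        self.min_freq = 0                           # smallest frequency present

    def _touch(self, key):
        """Move a key from its current frequency bucket to the next one."""
        freq = self.key_to_freq[key]
        del self.freq_to_keys[freq][key]
        if not self.freq_to_keys[freq]:
            del self.freq_to_keys[freq]
            if self.min_freq == freq:
                self.min_freq = freq + 1
        self.key_to_freq[key] = freq + 1
        self.freq_to_keys[freq + 1][key] = None

    def get(self, key):
        if key not in self.key_to_val:
            return None
        self._touch(key)
        return self.key_to_val[key]

    def put(self, key, value):
        if self.capacity <= 0:
            return
        if key in self.key_to_val:
            self.key_to_val[key] = value
            self._touch(key)
            return
        if len(self.key_to_val) >= self.capacity:
            # Evict the least-frequently used key; ties fall to the oldest entry.
            evict_key, _ = self.freq_to_keys[self.min_freq].popitem(last=False)
            if not self.freq_to_keys[self.min_freq]:
                del self.freq_to_keys[self.min_freq]
            del self.key_to_val[evict_key]
            del self.key_to_freq[evict_key]
        self.key_to_val[key] = value
        self.key_to_freq[key] = 1
        self.freq_to_keys[1][key] = None
        self.min_freq = 1
```

Every operation touches only a constant number of hash-table and bucket entries, which is what makes frequency-based eviction feasible on resource-constrained access points.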
Keywords
Least Frequently Used Caching, Online Change Point Detection, Small Cell, Fronthaul Links