HotKey-Cache: A HotKey-Separate Cache to Improve Performance for Key-Value Store

Jinkang Lu, Ping Xie

2023 2nd International Conference on Cloud Computing, Big Data Application and Software Engineering (CBASE)

Abstract
In most key-value stores, data access patterns are non-uniform and skewed, producing hotkeys and coldkeys. Caching mechanisms based on the Least Recently Used (LRU) policy, such as LRUCache, account only for the temporal relevance of data, while those based on the Least Frequently Used (LFU) policy, such as LFUCache, consider only the frequency of data access. These limitations reduce cache efficiency. This paper presents a novel caching mechanism called HotKey-Cache, which focuses on separating hotkeys. Building on LRUCache, HotKey-Cache separates hotkeys and assigns them to a specialized queue to prevent their direct eviction, so the design considers both the frequency of data access and its temporal relevance. Finally, we deploy these caches in LevelDB. Experimental results show a 12.67% improvement over LRUCache and a 28.52% improvement over LFUCache.
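The abstract describes promoting hotkeys out of the ordinary LRU queue into a specialized queue that is not evicted directly. The C++ sketch below illustrates one way such a design could be structured; the promotion threshold, queue sizes, demotion rule, and the class name HotKeyCacheSketch are assumptions for illustration and are not taken from the paper.

```cpp
// Minimal sketch of a hotkey-separating cache layered on an LRU queue.
// Assumptions (not from the paper): promotion after a fixed access-count
// threshold, a bounded hot queue, and demotion of the coldest hotkey back
// to the LRU queue when the hot queue overflows.
#include <iostream>
#include <iterator>
#include <list>
#include <optional>
#include <string>
#include <unordered_map>

class HotKeyCacheSketch {
 public:
  HotKeyCacheSketch(size_t lru_capacity, size_t hot_capacity, size_t hot_threshold)
      : lru_capacity_(lru_capacity), hot_capacity_(hot_capacity),
        hot_threshold_(hot_threshold) {}

  // Look up a key; on a hit, update recency and possibly promote it to the hot queue.
  std::optional<std::string> Get(const std::string& key) {
    auto it = index_.find(key);
    if (it == index_.end()) return std::nullopt;
    Entry& e = *it->second;
    ++e.freq;
    if (!e.hot && e.freq >= hot_threshold_) {
      PromoteToHot(it->second);  // hotkeys move to a queue skipped by eviction
    } else {
      Touch(it->second);         // ordinary recency update within its queue
    }
    return e.value;              // list::splice keeps iterators/references valid
  }

  void Put(const std::string& key, const std::string& value) {
    auto it = index_.find(key);
    if (it != index_.end()) { it->second->value = value; Get(key); return; }
    if (!lru_.empty() && lru_.size() >= lru_capacity_) {
      // Evict only from the ordinary LRU queue; hot entries are protected.
      index_.erase(lru_.back().key);
      lru_.pop_back();
    }
    lru_.push_front({key, value, 1, false});
    index_[key] = lru_.begin();
  }

 private:
  struct Entry { std::string key; std::string value; size_t freq; bool hot; };
  using Iter = std::list<Entry>::iterator;

  void Touch(Iter it) {
    if (it->hot) hot_.splice(hot_.begin(), hot_, it);
    else         lru_.splice(lru_.begin(), lru_, it);
  }

  void PromoteToHot(Iter it) {
    it->hot = true;
    hot_.splice(hot_.begin(), lru_, it);
    if (hot_.size() > hot_capacity_) {
      // Demote the coldest hotkey back to the LRU queue instead of dropping it.
      Iter victim = std::prev(hot_.end());
      victim->hot = false;
      victim->freq = 0;
      lru_.splice(lru_.begin(), hot_, victim);
    }
  }

  size_t lru_capacity_, hot_capacity_, hot_threshold_;
  std::list<Entry> lru_;  // ordinary entries in LRU order
  std::list<Entry> hot_;  // separated hotkeys, never evicted directly
  std::unordered_map<std::string, Iter> index_;
};

int main() {
  HotKeyCacheSketch cache(/*lru_capacity=*/3, /*hot_capacity=*/2, /*hot_threshold=*/3);
  cache.Put("a", "1");
  for (int i = 0; i < 3; ++i) cache.Get("a");  // "a" becomes a hotkey
  cache.Put("b", "2");
  cache.Put("c", "3");
  cache.Put("d", "4");
  cache.Put("e", "5");                         // evictions hit only the LRU queue
  std::cout << (cache.Get("a") ? "hotkey a retained\n" : "a evicted\n");
}
```

In this sketch, frequency drives promotion while recency orders each queue, which mirrors the abstract's claim of considering both access frequency and temporal relevance; the paper's actual policy inside LevelDB may differ in its details.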
Keywords
key-value store,cache,frequency of access,temporal relevance