The Granularity Gap Problem: A Hurdle for Applying Approximate Memory to Complex Data Layout

ICPE (2021)

Abstract
The main memory access latency has not improved much for more than two decades, while CPU performance increased exponentially until recently. Approximate memory is a technique that reduces DRAM access latency in exchange for data integrity. It is expected to benefit applications that are robust to noisy input and intermediate data, such as artificial intelligence, image/video processing, and big-data analytics. To obtain reasonable outputs from applications on approximate memory, it is crucial to protect critical data while accelerating accesses to non-critical data. We refer to the minimum size of a contiguous memory region to which the same error rate is applied in approximate memory as the approximation granularity. A fundamental limitation of approximate memory is that the approximation granularity is as large as a few kilobytes. However, applications may have critical and non-critical data interleaved at a smaller granularity. For example, a data structure for a graph node can contain pointers to neighboring nodes (critical) and its score (non-critical, depending on the use case). Such a data structure cannot be directly mapped to approximate memory because of the gap between the approximation granularity and the granularity of data criticality. We refer to this issue as the granularity gap problem. In this paper, we first show that many applications potentially suffer from this problem. We then propose a framework to quantitatively evaluate the performance overhead of a possible method that avoids this problem using known techniques. The evaluation results show that the performance overhead is non-negligible compared to the expected benefit from approximate memory, suggesting that the granularity gap problem is a significant concern.
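To make the granularity gap concrete, below is a minimal C sketch. The struct layout, the field names (neighbors, score), and the array-splitting workaround are illustrative assumptions on our part, not code taken from the paper; they only mirror the graph-node example given in the abstract.

```c
#include <stdio.h>
#include <stdlib.h>

#define NUM_NEIGHBORS 4  /* hypothetical fixed out-degree, for illustration */

/*
 * Interleaved layout from the abstract's example: critical data
 * (pointers) and non-critical data (a score) sit a few bytes apart,
 * while approximate memory applies one error rate to a region of a
 * few kilobytes. The whole struct therefore cannot be mapped to
 * approximate memory without risking the pointers.
 */
struct node {
    struct node *neighbors[NUM_NEIGHBORS]; /* critical: a bit flip corrupts a pointer */
    double       score;                    /* non-critical (use-case dependent)       */
};

/*
 * A possible workaround with known techniques: split the structure by
 * criticality so each class occupies its own array, which can then be
 * placed on exact vs. approximate regions respectively.
 */
struct node_split {
    struct node_split *neighbors[NUM_NEIGHBORS]; /* kept in exact memory */
};

int main(void) {
    size_t n = 1024;

    /* Interleaved layout: scores and pointers share the same ~KB regions. */
    struct node *graph = malloc(n * sizeof *graph);

    /* Split layout: scores live in a separate array that could be backed
       by approximate memory; node i's score is scores[i]. */
    struct node_split *topo   = malloc(n * sizeof *topo);
    double            *scores = malloc(n * sizeof *scores); /* approximate-eligible */

    printf("interleaved node: %zu bytes\n", sizeof *graph);
    printf("split: topology %zu bytes + score %zu bytes per node\n",
           sizeof *topo, sizeof *scores);

    free(graph);
    free(topo);
    free(scores);
    return 0;
}
```

The split layout trades one allocation for two and replaces a single-struct access with accesses to two arrays, which costs extra address computation and worse locality; overhead of this kind is what the paper's evaluation framework quantifies.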
Keywords
approximate memory, memory systems, performance analysis