Special Issue on In-Memory Computing

IEEE Micro (2022)

Abstract
The articles in this special section focus on in-memory computing. Computer designers have traditionally separated the roles of storage and compute units: memories and caches stored data, while processors' logic units computed on it. Is this separation necessary? A human brain does not separate the two so distinctly. Why should a processor? The in-/near-memory computing paradigm blurs this distinction and places a dual responsibility on memory substrates: storing data and computing on it. Modern processors and accelerators dedicate over 90% of their aggregate silicon area to memory. In-/near-memory processing turns these memory units into powerful allies for massively parallel computing, which can accelerate a plethora of applications, including neural networks, …
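
To make the "memory that computes" idea concrete, the following is a minimal sketch, not taken from the articles in this issue, of one common embodiment: a neural-network layer whose weights stay resident in a memory array (modeled here as conductances of a resistive crossbar), with the matrix-vector product accumulated in place along the array's columns rather than by shuttling weights to a separate compute unit. All array sizes, the noise model, and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Weights "stored" in the memory array: rows = inputs, columns = outputs.
weights = rng.standard_normal((256, 64))

# Input activations applied as word-line voltages.
inputs = rng.standard_normal(256)

# In-array accumulation: each column sums input * conductance contributions,
# so the dot product is produced inside the array (ideal, noise-free model).
column_currents = inputs @ weights

# A crude non-ideality model: device variation perturbs each stored conductance.
noisy_weights = weights * (1 + 0.02 * rng.standard_normal(weights.shape))
measured = inputs @ noisy_weights

print("max |error| from device variation:", np.abs(measured - column_currents).max())
```

The sketch only models the dataflow: the point is that the weights never move, which is where the energy and bandwidth savings of in-/near-memory processing come from.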