Synaptic Resistors for Concurrent Inference and Learning with High Energy Efficiency.

ADVANCED MATERIALS (2019)

Abstract
The fastest supercomputer, Summit, has a speed comparable to that of the human brain, but is far less energy-efficient (≈10^10 FLOPS W^-1, floating-point operations per second per watt) than the brain (≈10^15 FLOPS W^-1). The brain processes and learns from "big data" concurrently via trillions of synapses operating in parallel analog mode. By contrast, computers execute algorithms on physically separated logic and memory transistors in serial digital mode, which fundamentally restrains them from handling "big data" efficiently. Existing electronic devices can perform inference with high speed and energy efficiency, but they still lack the synaptic functions needed to carry out concurrent convolutional inference and correlative learning as efficiently as the brain. In this work, synaptic resistors (synstors) are reported that emulate the analog convolutional signal processing, correlative learning, and nonvolatile memory functions of synapses. By circumventing the fundamental limitations of computers, a synaptic resistor circuit performs speech inference and learning concurrently in parallel analog mode with an energy efficiency of ≈1.6 x 10^17 FLOPS W^-1, about seven orders of magnitude higher than that of the Summit supercomputer. Scaled-up synstor circuits could circumvent these fundamental limitations and facilitate real-time inference and learning from "big data" with high efficiency and speed in intelligent systems.
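The abstract does not specify the circuit-level learning rule, so the sketch below is only a minimal numerical illustration of the concept it describes: inference modeled as an analog vector-matrix product across crossbar conductances, with "correlative learning" approximated here by a Hebbian-style outer-product update applied concurrently. The array dimensions, learning rate, and function names are hypothetical, and the final lines simply reproduce the orders-of-magnitude efficiency gap quoted above.

```python
import numpy as np

# Hedged sketch, not the authors' implementation: a synaptic-resistor
# crossbar is modeled as conductances G; input voltages x drive the rows
# and each column sums its currents, i.e. y = G @ x in one analog step.
rng = np.random.default_rng(0)

n_inputs, n_outputs = 16, 4                          # hypothetical crossbar size
G = rng.uniform(0.1, 1.0, (n_outputs, n_inputs))     # conductances (weights)

def infer(G, x):
    """Parallel analog inference: one vector-matrix product."""
    return G @ x

def correlative_update(G, x, y, lr=1e-3):
    """Illustrative Hebbian-style correlative update (assumed form)."""
    return G + lr * np.outer(y, x)

x = rng.normal(size=n_inputs)      # e.g., one frame of speech features
y = infer(G, x)                    # inference on the input
G = correlative_update(G, x, y)    # learning applied concurrently to the same signal

# Efficiency figures quoted in the abstract:
synstor_eff = 1.6e17               # FLOPS per watt (reported for the synstor circuit)
summit_eff = 1e10                  # FLOPS per watt (approximate, Summit)
print(f"~{np.log10(synstor_eff / summit_eff):.1f} orders of magnitude")
```

Running the last lines prints roughly 7.2, consistent with the "about seven orders of magnitude" claim in the abstract.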
Keywords
carbon nanotube,concurrent inference and learning,high energy efficiency,parallelism,synaptic resistor