Fractal Structure in Hokusai’s “Great Wave” and the Memory Neural Network

Advances in Cognitive Neurodynamics (VII), 2021

Abstract
Google used 10 million natural images as input and performed self-organized learning with a huge neural network containing 10 billion synapses; neurons with a receptive field resembling a cat emerged in the upper layer. Hokusai, by contrast, drew the “Great Wave” using a memory with a fractal structure. Which do you think is “beautiful”: Google’s cat picture or Hokusai’s “Great Wave”? I think Hokusai’s is, because it is based on stunning information compression. The network proposed in this paper is a one-layer artificial neural network with feedforward and feedback connections. In the feedforward connections, the spatiotemporal learning rule (STLR; Tsukada et al.) has a high ability in pattern separation, while in the recurrent connections the Hebbian learning rule (HEB) excels at pattern completion. The interaction between the two rules plays an important role in self-organizing context-dependent attractors in the memory network; which attractors form depends on the balance between STLR and HEB. This structure is an important factor enabling memory networks to hierarchically embed a sequence of events.
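The architecture described above can be illustrated with a minimal sketch: a single layer whose feedforward weights are trained by a simplified STLR-like rule (driven by input-input coincidence across time steps, favoring pattern separation) and whose recurrent weights are trained by a classic Hebbian rule (favoring pattern completion), mixed at recall time by a balance parameter. The rule forms, the parameter `alpha`, and all numerical choices here are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50        # number of neurons in the single layer (assumed)
alpha = 0.7   # assumed balance parameter: STLR vs. Hebbian drive at recall

W_ff = rng.normal(0, 0.1, (N, N))   # feedforward weights (STLR-trained)
W_rc = np.zeros((N, N))             # recurrent weights (Hebbian-trained)

def stlr_update(W, x_prev, x_now, eta=0.05):
    """Toy spatiotemporal rule: potentiate synapses whose presynaptic
    inputs coincide across successive time steps, independent of the
    postsynaptic output -- a crude simplification of STLR."""
    return W + eta * np.outer(x_now, x_prev)

def hebbian_update(W, x, eta=0.05):
    """Classic Hebbian rule on recurrent connections: strengthen
    synapses between co-active neurons (supports pattern completion)."""
    W = W + eta * np.outer(x, x)
    np.fill_diagonal(W, 0.0)        # no self-connections
    return W

# Present a short sequence of binary patterns (a "sequence of events")
patterns = (rng.random((5, N)) > 0.5).astype(float)
for t in range(1, len(patterns)):
    W_ff = stlr_update(W_ff, patterns[t - 1], patterns[t])
    W_rc = hebbian_update(W_rc, patterns[t])

# Recall from a degraded cue: alpha mixes feedforward (separation)
# and recurrent (completion) drive, shaping which attractor is reached.
cue = patterns[2].copy()
cue[: N // 4] = 0.0
state = (alpha * W_ff @ cue + (1 - alpha) * W_rc @ cue > 0).astype(float)
overlap = state @ patterns[2] / patterns[2].sum()
print(f"overlap with stored pattern: {overlap:.2f}")
```

Sweeping `alpha` in such a sketch shifts recall between separation-dominated and completion-dominated regimes, which is the balance the abstract attributes to context-dependent attractor formation.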
Keywords
Great Wave, Hokusai, memory neural network, neural network