A Hierarchical Generative Model Of Recurrent Object-Based Attention In The Visual Cortex

ICANN'11: Proceedings of the 21st International Conference on Artificial Neural Networks - Volume Part I (2011)

Abstract
In line with recent work exploring Deep Boltzmann Machines (DBMs) as models of cortical processing, we demonstrate the potential of DBMs as models of object-based attention, combining generative principles with attentional ones. We show: (1) how inference in DBMs can be related qualitatively to theories of attentional recurrent processing in the visual cortex; (2) that depth and topographic receptive fields are important for realizing the attentional state; (3) how more explicit attentional suppressive mechanisms can be implemented, depending crucially on sparse representations being formed during learning.
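The abstract's link between DBM inference and recurrent attentional processing rests on the standard mean-field updates for a DBM, in which each hidden layer's posterior is repeatedly re-estimated from both the layer below (bottom-up) and the layer above (top-down) until the network settles. The sketch below illustrates this settling loop for a two-hidden-layer DBM; the layer sizes, step count, and randomly initialised weights are illustrative assumptions, not parameters or code from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical layer sizes (not taken from the paper).
n_v, n_h1, n_h2 = 784, 500, 200

# Randomly initialised weights stand in for learned DBM parameters.
W1 = 0.01 * rng.standard_normal((n_v, n_h1))   # visible  -> hidden layer 1
W2 = 0.01 * rng.standard_normal((n_h1, n_h2))  # hidden 1 -> hidden layer 2
b1 = np.zeros(n_h1)
b2 = np.zeros(n_h2)

def mean_field_inference(v, n_steps=20):
    """Iterative mean-field inference in a two-hidden-layer DBM.

    Each hidden layer's mean activation is refined using input from the
    layer below and feedback from the layer above, so inference is a
    recurrent settling process rather than a single feed-forward pass.
    """
    mu1 = sigmoid(v @ W1 + b1)        # bottom-up initialisation
    mu2 = sigmoid(mu1 @ W2 + b2)
    for _ in range(n_steps):
        mu1 = sigmoid(v @ W1 + mu2 @ W2.T + b1)  # bottom-up + top-down
        mu2 = sigmoid(mu1 @ W2 + b2)             # top layer sees layer below
    return mu1, mu2

# Example: infer hidden activations for one random binary input pattern.
v = (rng.random(n_v) < 0.5).astype(float)
mu1, mu2 = mean_field_inference(v)
print(mu1.shape, mu2.shape)  # (500,) (200,)
```

In this sketch, the repeated top-down term `mu2 @ W2.T` is what gives higher layers a route to modulate lower-layer activity, which is the property the paper relates to object-based attentional feedback.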
Keywords
attentional recurrent processing,attentional state,explicit attentional suppressive mechanism,cortical processing,Deep Boltzmann Machines,generative principle,object-based attention,recent work,sparse representation,topographic receptive field,hierarchical generative model,recurrent object-based attention,visual cortex