Optimal Sparse Approximation With Integrate And Fire Neurons

International Journal of Neural Systems (2014)

Abstract
Sparse approximation is a hypothesized coding strategy in which a population of sensory neurons (e.g., in V1) encodes a stimulus using as few active neurons as possible. We present the Spiking LCA (locally competitive algorithm), a rate-encoded spiking neural network (SNN) of integrate-and-fire neurons that computes sparse approximations. The Spiking LCA is designed to be equivalent to the non-spiking LCA, an analog dynamical system that converges exponentially to an ℓ1-norm sparse approximation. We show that the firing rates of the Spiking LCA converge to the same solution as the analog LCA, with an error inversely proportional to the sampling time. We simulate in NEURON a network of 128 neuron pairs that encodes 8x8-pixel image patches, demonstrating that the network converges to nearly optimal encodings within 20 ms of biological time. We also show that with more biophysically realistic neuron parameters, the gain function encourages additional ℓ0-norm sparsity in the encoding, relative both to ideal neurons and to digital solvers.
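For context, the non-spiking (analog) LCA that the Spiking LCA emulates follows the standard locally competitive dynamics of Rozell et al.: a leaky integration of the feedforward drive with lateral inhibition, passed through a thresholding nonlinearity. The sketch below is a minimal NumPy illustration under that assumption; the variable names (Phi, lam, tau), the soft-threshold activation, and the Euler step size are illustrative choices and are not taken from the paper.

```python
# Minimal sketch of the analog LCA, assuming the standard Rozell-style dynamics.
import numpy as np

def soft_threshold(u, lam):
    """Soft-thresholding activation: a = T_lambda(u)."""
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def analog_lca(s, Phi, lam=0.1, tau=0.01, dt=1e-3, n_steps=2000):
    """Integrate  tau * du/dt = Phi^T s - u - (Phi^T Phi - I) a,  a = T_lambda(u).
    The fixed point minimizes 0.5*||s - Phi a||^2 + lam*||a||_1,
    i.e. an l1-norm sparse approximation of the stimulus s."""
    n_neurons = Phi.shape[1]
    u = np.zeros(n_neurons)                       # membrane-like internal state
    drive = Phi.T @ s                             # feedforward input
    inhibition = Phi.T @ Phi - np.eye(n_neurons)  # lateral competition weights
    for _ in range(n_steps):
        a = soft_threshold(u, lam)
        u += dt * (drive - u - inhibition @ a) / tau   # forward-Euler step
    return soft_threshold(u, lam)

# Example with the same dimensions as the simulated network: 128 dictionary
# elements encoding a 64-dimensional (8x8) patch; random data stands in here.
rng = np.random.default_rng(0)
Phi = rng.standard_normal((64, 128))
Phi /= np.linalg.norm(Phi, axis=0)                # unit-norm dictionary atoms
s = rng.standard_normal(64)                       # stand-in for an image patch
a = analog_lca(s, Phi)
print("active coefficients:", np.count_nonzero(a), "of", a.size)
```

In the paper's spiking version, each unit of this analog system is replaced by an integrate-and-fire neuron pair whose firing rate plays the role of the coefficient a, which is why the rate estimate approaches the analog solution as the sampling window grows.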
Keywords
Sparse coding, spiking neural networks, locally competitive algorithm