Grid Codes versus Multi-Scale, Multi-Field Place Codes for Space

Robin Dietrich, Nicolai Waniek, Martin Stemmler, Alois Knoll

bioRxiv (2023)

Abstract
Recent work on bats flying over long distances has revealed that single hippocampal cells represent space on many scales, suggesting that these cells simultaneously participate in multiple neuronal networks to yield a multi-scale, multi-field place cell code. While the first theoretical analyses revealed that this code outperforms classical single-scale, single-field place codes, it remains an open question what the performance boundaries of this code are, what functional properties the network generating it has, and how it compares to a highly regular grid code, in which cells form distinct modules, each with its own attractor dynamics. In this paper, we address these questions with rigorous analyses of comprehensive simulations. Specifically, we perform an evolutionary optimization of several multi-scale, multi-field place cell networks and compare the results against a single-scale, single-field place code as well as against a simple grid code. We focus on two main characteristics: the general performance of the code itself and the dynamics of the network generating it. Our simulation experiments show that, under normal conditions, the grid code easily outperforms any multi-scale, multi-field place code with respect to decoding accuracy. However, the latter is more robust to noise and to lesions such as neuronal drop-out. This robustness comes at a cost: the multi-scale, multi-field place code requires significantly more neurons, and more fields per neuron, than the grid code. Further analyses of the network dynamics also revealed that the proposed topology of multi-scale, multi-field place cells does not, in fact, result in a continuous attractor network; more precisely, the simulated networks do not maintain activity bumps without position-specific input. The multi-scale, multi-field code therefore appears to be a compromise between a place code and a grid code, trading off accurate positional encoding against robustness.

Competing Interest Statement: The authors have declared no competing interest.
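The abstract centers on comparing the decoding accuracy of place-like and grid-like population codes. As a rough, hypothetical illustration of what such a comparison involves (this is not the authors' simulation, evolutionary optimization, or network model; the tuning-curve shapes, field widths, module periods, and cell counts below are invented for the sketch), the following Python snippet decodes a 1D position from noisy Poisson spike counts under an idealized single-field place code and a simple two-module periodic code:

```python
# Hypothetical sketch: maximum-likelihood decoding of a 1D position from Poisson
# spike counts, comparing a single-field place code with a two-module periodic
# ("grid-like") code. All parameters are illustrative, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
L = 10.0                          # track length (arbitrary units)
xs = np.linspace(0.0, L, 1000)    # candidate positions for the decoder
dt = 0.1                          # duration of one decoding time bin (s)

def place_rates(x, centers, sigma=0.5, peak=20.0):
    # Gaussian single-field place tuning curves: shape (n_cells, len(x)).
    return peak * np.exp(-(x - centers[:, None]) ** 2 / (2.0 * sigma ** 2))

def grid_rates(x, periods, phases, peak=20.0):
    # Idealized periodic 1D tuning curves, one spatial period per module.
    return peak * 0.5 * (1.0 + np.cos(2.0 * np.pi * (x - phases[:, None]) / periods[:, None]))

def ml_decode(counts, rate_map):
    # Maximum-likelihood position estimate under independent Poisson spiking.
    lam = rate_map * dt + 1e-9
    loglik = (counts[:, None] * np.log(lam) - lam).sum(axis=0)
    return xs[np.argmax(loglik)]

n_cells = 32
centers = np.linspace(0.0, L, n_cells)          # place code: evenly tiled fields
periods = np.repeat([1.3, 3.1], n_cells // 2)   # grid code: two "modules"
phases = rng.uniform(0.0, L, n_cells)           # random phase per grid cell

codes = {"place": place_rates(xs, centers),
         "grid": grid_rates(xs, periods, phases)}

errors = {name: [] for name in codes}
for _ in range(200):
    x_true = rng.uniform(0.0, L)
    idx = np.argmin(np.abs(xs - x_true))        # nearest decoder bin to the true position
    for name, rate_map in codes.items():
        counts = rng.poisson(rate_map[:, idx] * dt)   # noisy spike counts in one bin
        errors[name].append(abs(ml_decode(counts, rate_map) - x_true))

for name, errs in errors.items():
    print(f"{name} code: mean absolute decoding error = {np.mean(errs):.3f}")
```

The actual study is far more elaborate (evolutionary optimization of multi-scale, multi-field networks, noise and lesion conditions, and analyses of attractor dynamics); this sketch only conveys the basic maximum-likelihood decoding setup on which such accuracy comparisons rest.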