Decision-making through integration of sensory evidence at prolonged timescales

bioRxiv (2018)

Abstract
When multiple pieces of information bear on a decision, the best approach is to combine the evidence provided by each one. Evidence integration models formalize the computations underlying this process [1–3], explain human perceptual discrimination behavior [4–11], and correspond to neuronal responses elicited by discrimination tasks [12–17]. These findings indicate that evidence integration is key to understanding the neural basis of decision-making [18–21]. Evidence integration has most often been studied with simple tasks that limit the timescale of deliberation to hundreds of milliseconds, but many natural decisions unfold over much longer durations. Because neural network models imply acute limitations on the timescale of evidence integration [22–26], it is unknown whether current computational insights can generalize beyond rapid judgments. Here, we introduce a new psychophysical task and report model-based analyses of human behavior that demonstrate evidence integration at long timescales. Our task requires probabilistic inference using brief samples of visual evidence that are separated in time by long and unpredictable gaps. We show through several quantitative assays how decision-making can approximate a normative integration process that extends over tens of seconds without accruing significant memory leak or noise. These results support the generalization of evidence integration models to a broader class of behaviors while posing new challenges for models of how these computations are implemented in biological networks.
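The contrast the abstract draws, between normative (lossless) integration and an accumulator whose memory leaks across long gaps, can be sketched as follows. This is a minimal illustrative model, not the authors' actual analysis: the function names, the gap durations, and the log-odds values of the evidence samples are all assumptions made for the example.

```python
import math

def perfect_integrator(log_odds_samples):
    """Ideal observer: sum the log-likelihood ratios of all samples,
    regardless of how much time separates them."""
    return sum(log_odds_samples)

def leaky_integrator(samples_with_gaps, tau):
    """Leaky accumulator: the running total decays exponentially during
    the gap (in seconds) preceding each sample, with time constant tau."""
    total = 0.0
    for gap, log_odds in samples_with_gaps:
        total *= math.exp(-gap / tau)  # memory leak accrued over the gap
        total += log_odds              # add the new sample's evidence
    return total

# Five brief evidence samples separated by long gaps (illustrative values):
# each pair is (gap in seconds before the sample, log-odds of the sample).
trial = [(10.0, 0.5), (8.0, 0.7), (12.0, -0.3), (6.0, 0.6), (9.0, 0.4)]

ideal = perfect_integrator(lo for _, lo in trial)   # 1.9
leaky_short = leaky_integrator(trial, tau=2.0)      # seconds-scale memory
leaky_long = leaky_integrator(trial, tau=1000.0)    # near-lossless memory
```

With a short time constant the accumulator effectively forgets each sample before the next one arrives, so its final total tracks only the most recent evidence; with a very long time constant it approximates the normative sum, which is the regime the behavioral results above support.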