Domain general frontoparietal regions show modality-dependent coding of auditory and visual rules

bioRxiv (2024)

Abstract
A defining feature of human cognition is our ability to respond flexibly to what we see and hear, changing how we respond depending on our current goals. In fact, we can rapidly associate almost any input stimulus with any arbitrary behavioural response. This remarkable ability is thought to depend on a frontoparietal 'multiple demand' circuit which is engaged by many types of cognitive demand and widely referred to as domain general. However, it is not clear how responses to multiple input modalities are structured within this system. Domain generality could be achieved by holding information in an abstract form that generalises over input modality, or in a modality-tagged form, which uses similar resources but produces unique codes to represent the information in each modality. We used a stimulus-response task, with conceptually identical rules in two sensory modalities (visual and auditory), to distinguish between these possibilities. Multivariate decoding of functional magnetic resonance imaging data showed that representations of visual and auditory rules recruited overlapping neural resources but were expressed in modality-tagged non-generalisable neural codes. Our data suggest that this frontoparietal system may draw on the same or similar resources to solve multiple tasks, but does not create modality-general representations of task rules, even when those rules are conceptually identical between domains.

### Competing Interest Statement

The authors have declared no competing interest.
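The abstract's central test, cross-modality decoding, asks whether a classifier trained on rule-related activity patterns from one modality can classify the same rules from the other. A minimal sketch of that logic on simulated voxel patterns is below; the data, the nearest-centroid classifier, and all variable names are illustrative assumptions, not the authors' actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

def decode_accuracy(train_X, train_y, test_X, test_y):
    """Nearest-centroid decoder: classify each test pattern by its
    nearest class-mean from the training set."""
    centroids = np.stack([train_X[train_y == c].mean(axis=0) for c in (0, 1)])
    dists = np.linalg.norm(test_X[:, None, :] - centroids[None, :, :], axis=2)
    return (dists.argmin(axis=1) == test_y).mean()

# Hypothetical "modality-tagged" data: the two rules are separated along
# a different pattern axis in each modality (axes made orthogonal here).
n_trials, n_voxels = 50, 20
axis_vis = rng.normal(size=n_voxels)
axis_aud = rng.normal(size=n_voxels)
axis_aud -= (axis_aud @ axis_vis) / (axis_vis @ axis_vis) * axis_vis

rules = rng.integers(0, 2, n_trials)            # rule label per trial
signs = 2 * rules - 1                           # map {0,1} -> {-1,+1}
vis = rng.normal(size=(n_trials, n_voxels)) + np.outer(signs, axis_vis)
aud = rng.normal(size=(n_trials, n_voxels)) + np.outer(signs, axis_aud)

within = decode_accuracy(vis, rules, vis, rules)  # within-modality: high
across = decode_accuracy(vis, rules, aud, rules)  # cross-modality: ~chance
```

Under this simulated modality-tagged code, within-modality decoding succeeds while train-on-visual/test-on-auditory decoding stays near chance (0.5), which is the dissociation pattern the abstract reports; a modality-general code would instead yield above-chance cross-decoding.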