
Orthographic transparency modulates the grain size of orthographic processing: behavioral and ERP evidence from bilingualism.

Brain Research (2013)

Abstract
Grapheme-to-phoneme mapping regularity is thought to determine the grain size of orthographic information extracted whilst encoding letter strings. Here we tested whether learning to read in two languages differing in their orthographic transparency yields different strategies for encoding letter strings, as compared to learning to read in one (opaque) language only. Sixteen English monolingual and 16 early Welsh–English bilingual readers undergoing event-related brain potential (ERP) recordings were asked to report whether or not a target letter displayed at fixation was present in either a nonword (consonant string) or an English word presented immediately before. Bilinguals and monolinguals showed similar behavioural performance on target detection in words and nonwords, suggesting similar orthographic encoding in the two groups. By contrast, the amplitudes of ERPs locked to the target letters (P3b, 340–570 ms post target onset, and a late frontal positive component, 600–1000 ms post target onset) were differently modulated by the position of the target letter in words and nonwords between bilinguals and monolinguals. P3b results show that bilinguals who learnt to read simultaneously in an opaque and a transparent orthography encoded orthographic information presented to the right of fixation more poorly than monolinguals. Conversely, only monolinguals exhibited a position effect on the late positive component for both words and nonwords, interpreted as a sign of better re-evaluation of their responses. The present study sheds light on how orthographic transparency constrains the grain size and visual strategies underlying letter-string encoding, and how those constraints are influenced by bilingualism.
Keywords
Bilingualism, Reading, Orthographic processing grain size, Visual attention span, Orthographic transparency