
Lexically-guided perceptual learning does generalize to new phonetic contexts.

J. Phonetics (2021)

Abstract
Lexically-guided and visually-guided perceptual learning have been argued to tap into the same general perceptual mechanism. Using the visually-guided paradigm, some have argued that the resulting retuning effect is specific to the phonetic context in which it is learned, which in turn has been used to argue that such retuning targets context-dependent sub-lexical units. We use three new experiments to study the generalizable nature of lexically-guided perceptual learning of fricative consonants and how type variation in the training stimuli affects it. In contrast to visually-guided retuning, we show that lexical retuning does generalize to new phonetic contexts, particularly when listeners are trained with type variation. This suggests that there is an abstract context-independent representation that is used in speech perception and during lexical retuning. While the same generalization is not clearly observed when type variation is eliminated, the lack of a clear interaction effect between training types prevents us from inferring that lexically-guided perceptual learning needs type variation within the training stimuli to generalize to new phonetic contexts. Furthermore, we point out that some of these effects are subtle and are only observable if we take into account pre-training group differences between the control and test groups.
Keywords
Speech perception, Perceptual learning, Lexical retuning, Prelexical processing, Type variation