Lindsay, S.
University of Hull
Accounts of embodied language comprehension have argued that language can automatically modulate attention and perception consistent with our embodied experience; e.g., hearing the word "bird" should shift attention to the upper visual field and activate a perceptual simulation of the referent. This study used eye tracking combined with a novel word learning paradigm to test whether overt attention shifts occurred consistent with these claims. Participants learned a series of words that were associated with visual objects. During training, these objects systematically appeared in different locations. In a test task, ostensibly part of training, participants heard the label for an object, saw the object appear, and had to verify whether the target object matched the novel word's meaning. Objects and locations could either match or mismatch those of training. Unlike studies that have reported mismatch effects (slower reaction times for incongruent trials), participants were faster at locations that matched their training history, indicating a processing benefit. We also investigated anticipatory overt attention shifts based on findings from the "blank-screen" paradigm. The results shed new light on the interface between language and visual-spatial processing, and show the benefits of word learning paradigms for investigating that relationship.