Kolozsvári, O., Xu, W. & Hämäläinen, J. A.
University of Jyväskylä
During speech perception, listeners rely on multi-modal input and make use of both visual and auditory information.
We investigated how the familiarity of the presented stimuli affects brain responses to audiovisual speech. Nine participants (native Finnish speakers, right-handed, mean age 24.2 years, SD 3.33) watched videos of a Chinese speaker pronouncing syllables (/pa/, /pha/, /ta/, /tha/, /fa/) during an MEG measurement. Their task was to press a button when presented with the /fa/ stimulus in visual, auditory, or audiovisual (AV) form. Half of the stimuli (/pa/ and /ta/) were familiar to the participants, belonging to Finnish phonology, while the other half (/pha/ and /tha/) were unfamiliar.
We found significant differences between responses to syllables belonging to Finnish phonology and those that do not. These results suggest that long-term memory representations for speech sounds are reflected in brain activity already in the 300-400 ms time window, measured from the start of mouth movement in the stimuli, and again in the 500-600 ms time window (200-300 ms after the onset of the syllable sound).