When: 12:00 PM (noon).
Recent evidence suggests that brain rhythms track the acoustic envelope of auditory speech (speech entrainment) and that this mechanism facilitates speech intelligibility. We recently demonstrated that the same holds for visual speech (lip movements). This led us to ask to what extent auditory and visual information are represented in brain areas, either jointly or individually. In my talk, I will present our recent work showing how entrained auditory and visual speech information interact to yield a unified percept of audiovisual speech. Here we used a novel Information Theory approach to decompose dynamic information quantities. I will briefly give an overview of Information Theory measures and explain why they are useful in multivariate neuroimaging studies. In addition, I will discuss our recent advances on the question of linking function to anatomy via diffusion tensor imaging, examining whether the degree of white matter integrity in individuals differentially predicts the speech entrainment and information interactions that we observed.
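As a rough illustration of the kind of information-theoretic decomposition mentioned above, the sketch below estimates mutual information from discretized signals and an interaction-information term between two stimulus streams and a neural response. This is a minimal, hypothetical example with assumed variable names (`aud`, `vis`, `brain`) and a simple histogram estimator, not the specific method used in the talk.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Estimate I(X;Y) in bits from a joint histogram of two discretized signals."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal P(X)
    py = pxy.sum(axis=0, keepdims=True)   # marginal P(Y)
    nz = pxy > 0                          # avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

def interaction_information(aud, vis, brain, bins=8):
    """Interaction information I({A,V};B) - I(A;B) - I(V;B).

    Under this sign convention, negative values indicate redundancy
    (auditory and visual streams carry overlapping information about the
    neural response) and positive values indicate synergy.
    """
    # Discretize the (aud, vis) pair into a single joint variable
    a = np.digitize(aud, np.histogram_bin_edges(aud, bins))
    v = np.digitize(vis, np.histogram_bin_edges(vis, bins))
    joint_av = a * (bins + 2) + v
    return (mutual_information(joint_av, brain, bins)
            - mutual_information(aud, brain, bins)
            - mutual_information(vis, brain, bins))
```

For example, if `aud`, `vis`, and `brain` are three copies of the same binary signal, the two streams are fully redundant and the interaction term comes out at -1 bit; independent noise signals give values near zero (up to the positive bias of histogram estimators on finite samples).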