When: 12 noon.
In face‐to‐face conversation, speech is produced and perceived through multiple modalities. Movements of the lips, jaw, and tongue, for instance, modulate air pressure to produce a complex waveform perceived by the listener's ears. Visually salient articulatory movements (of the lips and jaw) also contribute to speech identification, both in acoustically degraded and in non‐degraded conditions. Although many studies have examined the role of visual cues in speech perception, much less is known about their role in speech production. At the same time, a large body of work has emphasized the close relationship between the speech production and speech perception systems. If perceived visual and auditory cues are not independent but instead act in synergy and complement each other, they must also be involved in the speech production process. In this talk, we explore the effects of auditory and visual feedback on speech production. In the first part, we describe how speech production mechanisms are affected when perceptual processes are altered, as in sensory‐deprived conditions (deafness and blindness in adults). In the second part, we present several developmental studies conducted in our lab to investigate the emergence of these sensorimotor links. We conclude by showing how these results can be transferred to clinical environments through a Living Lab.