Milne, A., Petkov, C. & Wilson, B.
Newcastle University, Newcastle upon Tyne, United Kingdom.
Language allows humans to communicate using different sensory modalities. Although it has been argued that nonhuman primate communication is inherently multisensory, direct behavioural comparisons between human and nonhuman primates are scant. Artificial grammar learning (AGL) tasks can be used to emulate the ordering relationships between words in a sentence. However, many comparative AGL studies investigate only a single sensory modality and lack direct comparisons with human behaviour. We used an AGL paradigm to evaluate how humans and macaque monkeys learn and respond to identically structured sequences of either auditory or visual stimuli. In both the auditory and the visual experiments, we found that both species were sensitive to the local predictive dependencies within the sequences. Moreover, humans and monkeys showed largely similar response patterns to the visual and auditory sequences, indicating that the sequences are processed comparably across the sensory modalities. These results provide evidence that human sequencing abilities stem from an evolutionarily conserved capacity for sequence processing that appears to be as multisensory in nonhuman primates as it is in humans. The findings set the stage for future neurobiological studies to investigate the multisensory nature of sequencing processes in nonhuman primates and how they compare to related processes in humans.