
Peter beim Graben & Serafim Rodrigues. Analyzing, Cognitive, and Neural Modeling of Language-Related Brain Potentials

2017/5/4 - BCBL Auditorium
What: Analyzing, Cognitive, and Neural Modeling of Language-Related Brain Potentials

Where: BCBL Auditorium

Who: Peter beim Graben, Bernstein Center for Computational Neuroscience Berlin, Germany; Serafim Rodrigues, BCAM, Bilbao, Spain.

When: 12:00 noon

How is the human language faculty neurally implemented in the brain? What are the neural correlates of linguistic computations? To what extent are neuromorphic cognitive architectures feasible, and could they eventually lead to new diagnostic and treatment methods in clinical linguistics (such as linguistic prosthetics)? These questions, at the interface of neurolinguistics, computational linguistics, and computational neuroscience, are addressed by the emerging discipline of computational neurolinguistics. In my presentation I will give an overview of my own research in computational neurolinguistics, within the framework of language-related brain potentials, i.e., event-related potentials (ERPs). By means of a paradigmatic ERP experiment on the processing and resolution of local ambiguities in German [1], I first introduce a novel method for identifying ERP components, such as the P600, as “recurrence structures” in neuronal dynamics [2]. In a second step, I use a neuro-computational approach called “neural automata” [3] to construct a context-free “limited repair parser” [3,4] for processing the linguistic stimuli of the study. Finally, I demonstrate how the time-discrete evolution of the automaton can be embedded into continuous time using winner-less competition in neural population models [2,5]. This leads to a representation of the automaton’s configurations as recurrence structures in the neural network, which can be correlated with experimentally measured ERPs through subsequent statistical modeling [6,7]. Further extensions toward usage-based grammar [8] and biophysical observation models [9] will be indicated.
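For readers unfamiliar with the first step: the notion of a “recurrence structure” builds on the recurrence matrix of a measured trajectory, in which two time points count as recurrent when the system's state returns to within a small radius of an earlier state, so that metastable episodes (candidate ERP components) show up as block-like clusters of mutually recurrent time points. The NumPy sketch below illustrates only this generic building block, not the actual analysis pipeline of [2]; the threshold eps, the toy two-dimensional data, and all names are illustrative assumptions.

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence matrix: R[i, j] = 1 iff ||x_i - x_j|| < eps.

    x   : array of shape (T, d), a d-channel trajectory (e.g. ERP data)
    eps : neighbourhood radius in state space (illustrative choice)
    """
    dists = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    return (dists < eps).astype(int)

# Toy trajectory that dwells in state A, moves to state B, and returns to A,
# mimicking the quasi-stationary segments that recurrence structure analysis
# groups into ERP components.
rng = np.random.default_rng(0)
state_a = np.tile([1.0, 0.0], (50, 1)) + 0.05 * rng.standard_normal((50, 2))
state_b = np.tile([0.0, 1.0], (50, 1)) + 0.05 * rng.standard_normal((50, 2))
x = np.vstack([state_a, state_b, state_a[::-1]])

R = recurrence_matrix(x, eps=0.3)
print(R.shape)  # (150, 150); block structure marks the recurrent segments
```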
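The “limited repair parser” of [3,4] targets garden-path situations: the parser deterministically commits to one analysis of a locally ambiguous prefix and, when later input disconfirms it, locally repairs its configuration instead of reparsing from scratch. The toy recognizer below only makes the ambiguity concrete, with a miniature grammar whose two S-rules share a prefix; it resolves the ambiguity by plain backtracking, which is precisely the global search that the cited diagnosis-and-repair mechanism is designed to replace. The grammar and category names are invented for illustration.

```python
# Miniature context-free grammar with a locally ambiguous prefix:
# both S-rules start with NP, so after reading "Det N V ..." the parser
# cannot yet know whether V closes the sentence or opens a relative clause.
GRAMMAR = {
    "S":  [["NP", "V"], ["NP", "RC", "V"]],
    "NP": [["Det", "N"]],
    "RC": [["V", "PP"]],   # reduced relative clause
    "PP": [["P", "NP"]],
}

def derives(tokens):
    """Top-down recognizer with full backtracking (a stand-in only; the
    limited repair parser of [3,4] replaces this global search with a
    local diagnosis-and-repair of the current configuration)."""
    def expand(stack, pos):
        if not stack:                      # empty stack: all input consumed?
            return pos == len(tokens)
        top, rest = stack[0], stack[1:]
        if top in GRAMMAR:                 # nonterminal: try each production
            return any(expand(rule + rest, pos) for rule in GRAMMAR[top])
        # terminal category: must match the next input token
        return pos < len(tokens) and tokens[pos] == top and expand(rest, pos + 1)
    return expand(["S"], 0)

print(derives(["Det", "N", "V"]))                        # True: simple reading
print(derives(["Det", "N", "V", "P", "Det", "N", "V"]))  # True: garden-path reading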
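Winner-less competition, the mechanism used in [2,5] to embed the automaton's discrete transitions into continuous time, is commonly modeled by generalized Lotka-Volterra rate equations with asymmetric inhibition: the state wanders along a chain of saddle equilibria, lingering near each one (a candidate correlate of a quasi-stationary ERP segment) before switching to the next. A minimal Euler-integration sketch follows; the growth rates and coupling matrix are textbook illustrative values, not parameters from the cited models.

```python
import numpy as np

# Generalized Lotka-Volterra rates with asymmetric, cyclic inhibition
# (May-Leonard-type values: 0 < 0.6 < 1 < 1.5 and 1.5 + 0.6 > 2 yield a
# robust heteroclinic cycle). Each saddle, where one population transiently
# "wins", can serve as the continuous-time image of one automaton state.
sigma = np.ones(3)                      # linear growth rates (illustrative)
rho = np.array([[1.0, 1.5, 0.6],        # rho[i, j]: how strongly x_j inhibits x_i
                [0.6, 1.0, 1.5],
                [1.5, 0.6, 1.0]])

def wlc_step(x, dt=0.01):
    """One Euler step of dx_i/dt = x_i * (sigma_i - sum_j rho_ij * x_j)."""
    return x + dt * x * (sigma - rho @ x)

x = np.array([0.6, 0.3, 0.1])
trajectory = []
for _ in range(5000):
    x = wlc_step(x)
    trajectory.append(x.copy())
# The trajectory now cycles through phases in which each population dominates
# in turn before ceding to the next; the dwell times near the saddles are
# what gets mapped onto quasi-stationary ERP segments.
```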