Trotter, A. S., Frost, R. L., & Monaghan, P.
Lancaster University
Hierarchical centre embeddings (HCEs) in natural language (e.g. "The cat the dog chased ran") have been taken as evidence that language is not a finite state system. Whilst phrase structure may be necessary to produce HCEs, sequential processing may underpin their comprehension, with cues to dependencies, such as pitch and rhythm variation and semantic similarities, providing critical support for dependency detection. Past research using artificial languages has encountered difficulties in demonstrating acquisition of phrase structure grammar. However, the languages used in these studies have seldom featured the natural language cues that would facilitate learning via finite state computations.
Eighty adults were trained on an artificial grammar containing HCEs, with each participant assigned to one of five conditions incorporating natural language cues: similarity, rhythm, pitch, combined (similarity + rhythm + pitch), or no cues. Training comprised 12 blocks; in each block, participants received 2 minutes of familiarisation and then performed a grammaticality classification task on novel sequences. Performance was at chance in early blocks, the combined condition was most accurate in intermediate blocks, and the similarity condition was most accurate in later blocks. With more experience, then, individual cues to linguistic dependencies increase in usefulness. Thus, multiple cues support the learning of HCEs, suggesting that the acquisition of phrase structure may be driven by finite state computations.