When: 12 PM (noon).
Language happens in the here-and-now. Our memory for linguistic input is fleeting: new material rapidly obliterates previous material. How, then, can the brain deal successfully with the continual deluge of linguistic input? I argue that, to deal with this “Now-or-Never” bottleneck, the brain must incrementally compress and recode language input as rapidly as possible into increasingly abstract levels of linguistic representation. This perspective has profound implications for the nature of language processing, acquisition, and change. Focusing on language acquisition, I present a computational model that learns in a purely incremental fashion, through on-line processing of simple statistics, and offers broad, cross-linguistic coverage while uniting comprehension and production within a single framework. The model achieves strong performance across over 200 single-child corpora representing 29 languages from the CHILDES database. I conclude that the immediacy of language processing provides a fundamental constraint on accounts of language acquisition, implying that acquisition fundamentally involves learning to process, rather than inducing a grammar.
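To make the idea of purely incremental, on-line learning from simple statistics concrete, here is a minimal sketch of one such mechanism: a learner that updates word-to-word transitional probabilities one word at a time and posits a chunk boundary whenever the current transition is weaker than the running average it has seen so far. This is an illustrative, chunk-based-learning-style heuristic, not the actual model presented in the talk; all names and thresholds below are assumptions.

```python
from collections import defaultdict

class IncrementalChunker:
    """Illustrative sketch (not the talk's model): on-line chunking
    from running bigram transitional probabilities (TPs)."""

    def __init__(self):
        self.bigram = defaultdict(int)   # counts of (prev, word) pairs
        self.unigram = defaultdict(int)  # counts of prev word
        self.tp_sum = 0.0                # running sum of TPs encountered
        self.tp_n = 0                    # number of TPs encountered

    def tp(self, prev, word):
        # transitional probability P(word | prev); 0 if prev unseen
        n = self.unigram[prev]
        return self.bigram[(prev, word)] / n if n else 0.0

    def process(self, utterance):
        """Process one utterance word by word; return its chunks."""
        words = utterance.split()
        chunks, current = [], [words[0]]
        for prev, word in zip(words, words[1:]):
            p = self.tp(prev, word)
            avg = self.tp_sum / self.tp_n if self.tp_n else 0.0
            if p < avg:
                # weak transition relative to experience: chunk boundary
                chunks.append(current)
                current = [word]
            else:
                current.append(word)
            # update statistics only after the decision (purely on-line:
            # no second pass over the input, no stored corpus)
            self.tp_sum += p
            self.tp_n += 1
            self.bigram[(prev, word)] += 1
            self.unigram[prev] += 1
        chunks.append(current)
        return chunks
```

The key property the sketch shares with the approach described above is that nothing is ever revisited: each word is compressed into the running statistics at the moment it arrives, consistent with the Now-or-Never bottleneck.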