Cognitive scientists and linguists have abandoned Chomsky’s “universal grammar” theory in droves because of new research examining many different languages — and the way young children learn to understand and speak the tongues of their communities. That work fails to support Chomsky’s assertions.

The research suggests a radically different view, in which a child's acquisition of a first language does not rely on an innate grammar module. Instead the new research shows that young children use various types of thinking that may not be specific to language at all — such as the ability to classify the world into categories (people or objects, for instance) and to understand the relations among things.

These capabilities, coupled with a unique human ability to grasp what others intend to communicate, allow language to happen. The new findings indicate that if researchers truly want to understand how children, and others, learn languages, they need to look outside of Chomsky’s theory for guidance.

This conclusion is important because the study of language plays a central role in diverse disciplines — from poetry to artificial intelligence to linguistics itself; misguided methods lead to questionable results. Further, language is used by humans in ways no animal can match; if you understand what language is, you comprehend a little bit more about human nature.

Chomsky’s first version of his theory, put forward in the mid-20th century, meshed with two emerging trends in Western intellectual life. First, he posited that the languages people use to communicate in everyday life behaved like the mathematically based languages of the newly emerging field of computer science. His research looked for the underlying computational structure of language and proposed a set of procedures that would create “well-formed” sentences. The revolutionary idea was that a computerlike program could produce sentences real people thought were grammatical. That program could also purportedly explain the way people generated their sentences. This way of talking about language resonated with many scholars eager to embrace a computational approach to, well, everything.

As Chomsky was developing his computational theories, he was simultaneously proposing that they were rooted in human biology. In the second half of the 20th century, it was becoming ever clearer that our unique evolutionary history was responsible for many aspects of our unique human psychology, and so the theory resonated on that level as well. His universal grammar was put forward as an innate component of the human mind — and it promised to reveal the deep biological underpinnings of the world’s 6,000-plus human languages. The most powerful, not to mention the most beautiful, theories in science reveal hidden unity underneath surface diversity, and so this theory held immediate appeal.

But evidence has overtaken Chomsky’s theory, which has been inching toward a slow death for years. It is dying so slowly because, as physicist Max Planck once noted, older scholars tend to hang on to the old ways: “Science progresses one funeral at a time.” …

Some native Australian languages, such as Warlpiri, have grammatical elements scattered all over the sentence — noun and verb phrases that are not “neatly packaged” so that they can be plugged into Chomsky’s universal grammar — and some sentences have no verb phrase at all.
