Noam Chomsky's political views attract so much attention that it's easy to forget he's a scientist, perhaps one of the most influential who ever lived.
Beginning in the 1950s, Chomsky made the case that all humans possess an innate capacity for language, arguing that children learn to speak from minimal environmental stimuli. His ideas have profoundly shaped linguistics and the science of the mind ever since. They have also drawn many attacks, such as Tom Wolfe's widely criticized The Kingdom of Speech, in which he asserts that both Darwin and "Noam Charisma" were wrong. (See journalist Charles Mann's evisceration of Wolfe.)
Other critiques have a better basis in science. In “Language in a New Key,” in the November Scientific American, Paul Ibbotson and Michael Tomasello contend that “much of Noam Chomsky’s revolution in linguistics, including its account of the way we learn languages, is being overturned.” The online headline says “Evidence Rebuts Chomsky’s Theory of Language Learning.” Ibbotson and Tomasello propose that children acquire language via “general cognitive abilities and the reading of other people’s intentions.”
Steven Pinker, a cognitive scientist, psychologist, linguist, and popular science author, has weighed in on the recent debate. He has written several acclaimed books on language, notably The Language Instinct (1994) and The Stuff of Thought (2007).
In an emailed response to a journalist from Scientific American, Pinker defends Chomsky, sort of. Here are his comments in full:
None of this is new—the same debate has been playing out for fifty years. Contrary to the claim that Chomsky's theories are an orthodoxy, dominant approach, or consensus (which makes any challenger into a giant-slayer), they have never been anything close to the default in the sciences of language. I'd say that perhaps Chomsky's theory (at any time) has attracted a plurality of linguists, but probably never a majority, since there have always been rival theories (Generative Semantics, Cognitive Grammar, Relational Grammar, Lexical Functional Grammar, Generalized Phrase Structure Grammar, and, perhaps for most, just no commitment to any overarching theory at all). In other fields, Chomsky's theories were only ever embraced by a small minority, and the opposition has always been fierce: from philosophers like Putnam, Goodman, Searle, and Dennett in the 1960s and 1970s; from Jerome Bruner and Piagetian developmental psychologists in the 1970s; from Good Old-Fashioned AI researchers like Terry Winograd, Roger Schank, and Marvin Minsky in the 1970s; from the connectionist psychologists and neural-network AI researchers in the 1980s; from "dynamic systems theorists" like Linda Smith in the 1980s; probably from most child language acquisition researchers in every decade.
The misconception that Chomsky represents the dominant view comes from the fact that the opposition is divided into many approaches and factions, so there’s no single figure that can be identified with an alternative. Also, he’s famous and charismatic, and people outside the field have heard of him, but haven’t heard of anyone else, and confuse his fame with professional dominance.
Another problem with the claim that Chomsky’s theory of language “is being overturned” (as if it had ever been accepted, which is not true), is that it’s not clear what “Chomsky’s theory of language” refers to. He has proposed a succession of technical theories in syntax, and at the same time has made decades of informal remarks about language being innate, which have changed over the decades, and have never been precise enough to confirm or disconfirm. And it’s not so easy to say what “Universal Grammar” or an innate “language faculty” consists of; it’s necessarily abstract, since the details of any particular language, like Japanese or English, are uncontroversially learned. So for 50 years Chomsky has been a piñata, where anyone who finds some evidence that some aspect of language is learned (and there are plenty), or some grammatical phenomenon varies from language to language, claims to have slain the king. It has not been a scientifically productive debate, unfortunately.
My own view is that we need to create precise computational models of the language acquisition process – sentences in, grammar out – and see if they succeed in mastering the structure of any language whose sentences are fed into them, in a way that resembles the way children do it. Then whatever is in that model is the best theory of the child's innate learning abilities. Every now and again someone will try to do that (I did in my first book, Language Learnability and Language Development, in 1984). Failing that, it's all too easy to claim that children don't need any innate priors or assumptions or representations, only to sneak them back in when it comes time to get serious and implement a model. That was the trick in a lot of the neural-network models of language that were popular in the 80s and 90s – when the rubber met the road, they always built in innate structure without calling attention to it. That's what I suspect will be true of models based on the current ideas. – Steven Pinker