Wittgenstein’s Tractatus used an austere literary style containing no arguments as such, just declarative statements that are meant to be self-evident. I borrow from his style to present this criticism of Thornbury’s views on language learning.
The best way to explain the phenomena we want to understand is to propose an explanatory theory or hypothesis which is open to empirical tests of its veracity.
If the tides are the phenomenon, one theory that explains them is gravity (tides are caused mostly by a differential gravitational force). If L1 acquisition is the phenomenon, one theory is UG (humans’ brains are hard-wired for language).
We won’t ever prove that any general theory is true, because no finite set of particular observations can establish a universal claim: you can’t go from the particular to the general.
Just because the sun has always been observed to rise in the East and set in the West doesn’t prove that the theory which explains it is true, or assure us that the sun will rise in the East tomorrow.
If the sun doesn’t rise in the East tomorrow, then we’ll have evidence that the theory which predicted it would is false. If it does, we can hold on to the theory: it’s the best explanation so far.
So while we can’t prove theories are true, we can prove that they’re false, by finding data that contradict them.
If we find movements of tides that contradict the theory of gravity (e.g., the moon is furthest away from point X on earth when the tide is highest), or the theory of UG (e.g., language X doesn’t have UG principle Y, or child X has no knowledge of this principle) then the theory is falsified.
Attempts can be made to rescue theories from falsifying data.
In order to test theories, they need to be open to empirical (replicable) tests, and we need to use logic and criteria of rational argument (coherence and cohesion).
This provides a rough guide to how to evaluate conflicting explanations of how people learn languages.
How does Thornbury evaluate conflicting explanations of how people learn languages?
In reply to Chomsky’s Poverty of the Stimulus argument (children know things about language that can’t be inferred from the input they get) Thornbury says
you have to prove that aspects of syntax couldn’t have been acquired from input… otherwise it’s an ‘empirically-empty’ assertion.
To support this assertion he cites Daniel Everett (2012) who writes
No one has proven that the poverty of the stimulus argument, or Plato’s Problem, is wrong. But nor has anyone shown that it is correct either. The task is daunting if anyone ever takes it up. One would have to show that language cannot be learned from available data. No one has done this. But until someone does, talk of a universal grammar or language instinct is no more than speculation.
Thornbury demands the impossible and the demand shows an elementary fault in logic.
Rather than demand proof that aspects of syntax couldn’t have been acquired from input (a logically impossible demand), it’s his task to come up with counter evidence.
Needless to say, the daft quote from Everett, whose work is now thoroughly discredited (as even Thornbury has recognised), suffers from the same fatal weakness.
This is not, I insist, a silly semantic quibble: Thornbury’s demand for proof of what can’t be proved here is an indication that his criticism is illogical. I repeat: illogical.
In the same discussion Thornbury says: “Surely the onus of proof is on the nativists … to show that the stimulus is impoverished?”
Again we have the “proof” thing. That aside, it’s certainly reasonable to ask the nativists to provide evidence to support their theory, and in fact the nativists give a shedload of evidence to support their claim that the stimulus doesn’t explain what children know about language.
Recall Thornbury’s reaction to my reply to Russ Mayne’s ridiculous remark that Chomsky “spurned empirical research”. I said: “Chomsky’s theory of UG has a long and thorough history of empirical research”. Thornbury replied: “‘Chomsky’s theory of UG has a long and thorough history of empirical research’. What!!? Where? When? Who?” This suggests that Thornbury has a very poor grasp of Chomsky’s work.
Given that his formal criticism of UG is illogical, and given that he fails to appreciate how the theory is supported, the remaining question is: does Thornbury show that the stimulus is, pace Chomsky, enough to explain language learning?
No, he doesn’t.
Thornbury has indicated here and there that he’s drawn to emergentist theories of language learning, but his attempts to make sense of emergentist arguments are hopeless.
Thornbury supports the fanciful, sweeping claptrap proposed by Larsen-Freeman, and shows about as much critical acumen when talking about emergentism as he does when talking about Chomsky.
Various scholars have been working on connectionist views and associative learning for over 30 years now.
Nick Ellis and MacWhinney, for example, believe that the complexity of language emerges from relatively simple developmental processes exposed to a massive and complex environment.
MacWhinney’s Competition Model is a good example of an emergentist approach which rejects the nativist UG account of language and puts forward what Gregg (2003) calls an “empiricist emergentist” approach.
Gregg thinks that empiricist emergentism has been most forcefully and accurately advocated in a series of articles by Nick Ellis (e.g., Ellis, 1998; 1999; 2002a; 2002b; 2003) (Gregg, 2003: 43).
William O’Grady and his associates have been working for many years now on an alternative emergentist view that Gregg calls “nativist emergentism”. O’Grady argues the case for emergentism, but holds that certain types of innate concepts are required, since all approaches to cognition recognise the existence of innately guided learning of some sort; he also allows a significant place for frequency in explanatory work on language, while arguing that its effects are modulated by an efficiency-driven processor.
Gregg gives this summary of UG versus emergentism:
“So the lines are drawn: On the one hand, we have mad dog nativist theories which posit a rich, innate representational system specific to the language faculty, and non-associative mechanisms, as well as associative ones, for bringing that system to bear on input to create an L2 grammar. On the other hand, we have the emergentist position, which denies both the innateness of linguistic representations (Chomsky-modularity) and the domain-specificity of language learning mechanisms (Fodor-modularity)” (Gregg, 2003: 46).
Gregg says that at the root of the problem of any empiricist account is the poverty of the stimulus argument. Emergentists, by adopting an associative learning model and an empiricist epistemology (where some kind of innate architecture is allowed, but not innate knowledge, and certainly not innate linguistic representations) have a very difficult job explaining how children come to have the linguistic knowledge they do.
- How can general conceptual representations acting on stimuli from the environment explain the representational system of language that children demonstrate?
- How come children know which form-function pairings are possible in human-language grammars and which are not, regardless of exposure?
- How can emergentists deal with cases of instantaneous learning, or knowledge that comes about in the absence of exposure (i.e., a frequency of zero), including knowledge of what is not possible? (Eubank and Gregg, 2002: 238)
How does Thornbury argue the case for emergentism?
Nowhere does he give any well-argued reply to the problems alluded to above; nowhere does he recognise the huge differences between the works of Nick Ellis and Larsen-Freeman, let alone discuss O’Grady; and nowhere does he give any coherent account of his own emergentist theory.
That’s what he doesn’t do, but what does he say? He says things like this:
The child’s brain is mightily disposed to mine the input. A little stimulus goes a long way, especially when the child is so feverishly in need of both communicating and becoming socialized. General learning processes explain the rest.
If we generalize the findings beyond the single word level to constructions and then generalize from constructions to grammar, then hey presto, the grammar emerges on the back of the frequent constructions.
(This is a paraphrase of a previous post.)
In an article he wrote in 2009 for English Teaching Professional called Slow Release Grammar Thornbury says:
- emergence improves on Darwin as an explanation of natural development
- it explains language, language learning, and the failure of classroom-based adult ELT
- emergence is also the key to successful syllabus design.
This is his argument:
Emergence is everywhere in nature, where a system is said to have emergent properties when it displays complexity at a global level that is not specified at a local level. There are millions of such systems; the capacity of an ant colony to react in unison to a threat is an example. Because there is no “central executive” determining the emergent organisation of the system, the patterns and regularities which result have been characterised as “order for free”. Pure Larsen-Freeman.
Language exhibits emergent properties.
There are 2 processes by which language “grows and organises itself”.
The first is our capacity to detect and remember frequently-occurring sequences in the sensory data we are exposed to. In language terms, these sequences typically take the form of chunks (AKA formulaic expressions or lexical phrases).
The second is our capacity to unpack the regularities within these chunks, and to use these patterns as templates for the later development of a more systematic grammar.
It is as if the chunks – memorised initially as unanalysed wholes – slowly release their internal structure like slow-release pain-killers release aspirin. Language emerges as “grammar for free”.
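As a purely illustrative aside, the two processes described above – detecting frequently-occurring sequences, then unpacking the regularities inside them – can be sketched as a toy frequency model. The corpus, thresholds and variable names below are all invented for illustration; this is a minimal sketch of the general idea, not a claim about how any actual emergentist model works.

```python
from collections import Counter, defaultdict

# Invented toy "input": utterances a child might hear repeatedly.
corpus = [
    "i want a cookie", "i want a drink", "i want a story",
    "where is the ball", "where is the cat", "i want a hug",
]

def trigrams(tokens):
    """All three-word sequences in a token list."""
    return [tuple(tokens[i:i + 3]) for i in range(len(tokens) - 2)]

# Process 1: count recurring sequences; frequent ones count as "chunks".
counts = Counter()
for utterance in corpus:
    counts.update(trigrams(utterance.split()))
chunks = [seq for seq, n in counts.items() if n >= 3]

# Process 2: "unpack" the regularity inside the chunks - a two-word
# frame that recurs with several different final words suggests a
# slot-and-frame template (a proto-grammatical pattern).
fillers = defaultdict(set)
for (w1, w2, w3) in counts:
    fillers[(w1, w2)].add(w3)
templates = [" ".join(frame) + " _" for frame, s in fillers.items() if len(s) >= 3]

print(chunks)     # the frequent chunk(s) detected in the input
print(templates)  # the slot-and-frame pattern(s) extracted from them
```

On this toy input the frame “want a” takes four different fillers, so it is “unpacked” into a template with an open slot, while one-off sequences stay as unanalysed wholes.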
Thirdly, there is emergence in learning. Hoey notes how particular words and chunks recur in the same patterns. These can be seen in collocations, such as good morning; good clean fun; on a good day …; fixed phrases, such as one good turn deserves another, the good, the bad and the ugly; and colligations, as in it’s no good + -ing.
Hoey argues that, through repeated use and association, words are ‘primed’ to occur in predictable combinations and contexts. The accumulation of lexical priming creates semantic associations and colligations which, in Hoey’s words, nest and combine and give rise to an incomplete, inconsistent and leaky, but nevertheless workable, grammatical system.
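Hoey’s notion of priming is, at bottom, a claim about statistical association between words. As a hedged illustration only (the corpus and numbers are invented, and pointwise mutual information is just one of several standard association measures, not necessarily the one Hoey has in mind), a collocation strength can be computed like this:

```python
import math
from collections import Counter

# Invented toy corpus in which "good morning" recurs as a collocation.
text = ("good morning everyone " * 5 + "good clean fun " * 3 +
        "a good day " * 4 + "morning coffee " * 2).split()

unigrams = Counter(text)                 # single-word frequencies
bigrams = Counter(zip(text, text[1:]))   # adjacent-pair frequencies
total = len(text)

def pmi(w1, w2):
    """Pointwise mutual information: how much more often the pair
    occurs together than independence would predict - one crude
    statistical proxy for 'priming' strength."""
    p_pair = bigrams[(w1, w2)] / (total - 1)
    return math.log2(p_pair / ((unigrams[w1] / total) * (unigrams[w2] / total)))

print(round(pmi("good", "morning"), 2))
```

A score well above zero means the pair co-occurs far more often than chance, which is the statistical core of the claim that words are “primed” to occur in predictable combinations.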
Fourthly, the problems which adults have remembering and unpacking formulaic chunks don’t find their solution in most ELT classrooms where few opportunities for real communication are offered.
Wray says: “Classroom learners are rarely aiming to communicate a genuine message…, so there is no drive to use formulaic sequences for manipulative purposes.”
Even when adult learners do internalise formulaic chunks, they are often incapable of unpacking the grammar, perhaps because many chunks are not really grammatical (expressions like if I were you; you’d better not; by and large; come what may, etc, yield little or no generalisable grammar) and perhaps because they fail to notice the form.
Finally, we can put emergence into the classroom through the syllabus.
If the productive potential of formulaic language is to be optimised, at least four conditions need to prevail:
- Exposure – to a rich diet of formulaic language
- Focus on form – to promote noticing and pattern extraction
- A positive social dynamic – to encourage pragmatic and interpersonal language use
- Opportunities for use – to increase automaticity, and to stimulate storage in long-term memory, and recall.
If we examine the above, we note that Thornbury starts with Stuart Kauffman’s claim that the phenomenon whereby certain natural systems display complexity at a global level that is not specified at a local level is evidence of emergence and “order for free”. This highly-controversial view is then used in an attempt to add credibility to the suggestion that lexical chunks provide “grammar for free”.
Thornbury tells us that many formulaic chunks yield little or no generalisable grammar, which surely must impede their wondrous ability to slowly release their internal structure like slow-release pain-killers release aspirin. Or does their magic extend to releasing qualities which they don’t possess?
Thornbury gives an inadequate and mangled account of emergentism which, according to him, says that lexical phrases explain English grammar, how children learn English and why adults have difficulties learning English as a foreign language.
Thornbury’s unqualified assertion that language learning can be explained as the detection and memorisation of frequently-occurring sequences in the sensory data we are exposed to is probably wrong and certainly not the whole story.
At the very least, Thornbury should give a more measured description and discussion of emergentist views of language learning and acknowledge that it faces severe challenges as a theory.
Last, and maybe least, we get Thornbury’s depressing picture of the arid desert which is the standard adult EFL classroom followed by the triumphant portrayal of an emergentist syllabus, where the “productive potential” of formulaic language is unleashed.
The elusive, definitive recipe of language learning has been revealed: lashings of formulaic language, sprinkled with a little focus on form, served on a bed of positive social dynamic, with the chance of asking for more.
In the likely event that the positive social dynamic gets out of hand in these joyous classrooms, and the adult students start running amok, babbling formulaic chunks of colloquial language at each other, I recommend that the teacher gives out copies of that most calming, not to say soporific, textbook Natural Grammar.
Thornbury claims to be a bridge between researchers working on SLA and practising teachers. The justification given for the claim is simply that yes, he is. The question remains: do we need better bridges?
Eubank, L. and Gregg, K. R. (2002) News Flash – Hume Still Dead. Studies in Second Language Acquisition, 24, 2, 237-248.
Gregg, K. R. (2003) The state of emergentism in second language acquisition. Second Language Research, 19, 2, 95-128.