Bewildered in Barcelona

Dystopian literature can be fun and sometimes informative to read, but thinly-disguised predictions of a glorious future to come can also be quite enjoyable. H.G. Wells’ The Shape of Things to Come, which I have been delving into recently, belongs in the latter category.

In this work as in his other future-themed books, Wells was wrong about almost everything, but in this he is no different from most “futurists” of all eras. And the Fabian socialist daydreams that pervade the book were not just popular, but practically obligatory among the intellectual class in the period when the book was written. The Shape of Things to Come makes an excellent companion piece to the dystopian visions of Aldous Huxley and Evgeny Zamyatin, two writers who (I would argue) understood human nature much better than Wells.

Not surprisingly, Wells’ comments on education (or those of his fictional diarist, Dr. Raven) caught my eye in particular. Like many of those before him and innumerable education academics after him, Wells all but explicitly states that the education of his time, with its apparent focus on God and country, needed revolution rather than evolution. The following passage gave me a particular chuckle, given all the current “21st-century learning” clichés:

[Faber] sweeps aside almost contemptuously the claim that the nineteenth century was an educational century. We are misled, he argues, by a mere resemblance between the schools and universities of the past and the schools…of the modern period…our education is an introduction to the continual revolutionary advance of life. But education before the twenty-first century was essentially a conservative process. It was so rigorously and completely traditional that its extensive disorganization was an inevitable preliminary to the foundation of a new world.

Sound familiar to anyone?

The central plank of the new, scientific scheme of education envisaged by Wells’ various mouthpieces is a new “permanent system of ordered knowledge”, incomparably superior to any such compendium of the past:

What people knew in those days they knew in the most haphazard way…there was no such thing as a Centre of Knowledge in the world. It is remarkable to note how long mankind was able to carry on without any knowledge organization whatever…nor was there any conception of the need of a permanent system of ordered knowledge, continually revised until the twentieth century was nearing its end. To the people of the Age of Frustration our interlocking research, digest, discussion, verification, notification and informative organizations, our Fundamental Knowledge System, that is, with its special stations everywhere, its regional bureaus, its central city at Barcelona, its seventeen million active workers and its five million correspondents and reserve enquirers, would have seemed incredibly vast.

This is one of Wells’ predictions which was actually, in a way, broadly accurate. We have our Fundamental Knowledge System – it is called the internet. Google and Wikipedia are our Barcelona. And dreamy-eyed Ed visionaries have, for the past quarter-century at least, been predicting a new golden era of knowledge and understanding now that children have instant, miraculous access to the vast compendium of human knowledge. (I vividly remember one particularly silly academic rhapsodizing in 1996 about the great levelling of knowledge that would be occasioned by the rise of the internet. It was an unforgettable piece of self-parody.)

Now that “kids can Google it”, we can concentrate on more elevated aims than the transmission of mere disconnected facts. Or, erm, can we?

I think that if you were to ask teachers who have been in the profession for a couple of decades whether the general knowledge of students had increased or declined over the course of their career – a period which covers the increasing ubiquity of the internet and the deification of Google – the vast majority would say that it had declined.

Anecdotes may be odious, but I would like to share one which had a particular effect on me.

A couple of years ago, two of the students in my Year 12 class showed me a quiz that they had found online, in which a world map was provided with national borderlines included but country names blanked out. The aim was to put the names of fifteen countries (most of them fairly “well-known” ones) in the right places on the map.

These were two very bright students at a selective government high school. How had they done on the quiz? They had not been able to place a single country correctly. Not one.

It is not just in the area of geography; regular conversations with students have suggested to me that they are less conversant with basic historical, scientific and literary knowledge than at any time in recent memory. And every teacher of my acquaintance, when I have asked them about it, has concurred with this conclusion. Again, this is only anecdotal. But when the anecdotes accumulate beyond a certain point…

And yet this is the period in which students have their wonderful Barcelona to refer to. But the question is, of course, what use do they make of it?

It would not, in fact, be all that surprising if the “kids can Google it” attitude has gone hand-in-hand with a decline in general knowledge. If the assumption is simply that children can fill in the gaps in their basic knowledge by themselves, and internalize it by themselves, thus freeing the classroom for more vaguely-defined activities, teachers new to the profession are far less likely to ensure that children are equipped with such knowledge; let alone to consider “drilling” children in it. A science teacher at a high-performing school in the UK recently attracted some predictable, if belated, Twitter abuse for advocating the use of drills.

Furthermore, and here we reach the real point of this post, the mooted changes to the Australian curriculum – with much content knowledge apparently set to be ditched in favour of “deep understandings” and the like – are indicative of the “kids can Google it” attitude, which I am tempted to dub “Schleicherism”. I am all in favour of simplifying the flabby mess that is the Australian curriculum, but this should be done by stripping away the vague generalisations, not the important domain knowledge.

Yes, we have our Barcelona. But, to extend the metaphor slightly, it is up to the teachers to provide the road map.

 

A Critical Critique

Andrew Doyle, the British writer and comedian responsible for the hilarious Titania McGrath spoof, has written a thoughtful and engaging piece for Standpoint magazine dealing with the problem of tribalism and an unwillingness to think beyond one’s convictions. Although I completely agree with his appraisal of contemporary political discourse, I think he overstates the capacity of instruction in “critical thinking” to solve the problem.

Mr. Doyle sensibly suggests that the recognition of “critical thinking” as a discrete academic discipline would be unwise. In this, he is following the work of the redoubtable Professor Daniel Willingham, whose recent researchED talk stressed at all times the importance of domain knowledge. But Mr. Doyle is still optimistic about the transformative possibilities inherent in “critical thinking” instruction:

Pupils learn about common fallacies such as the ad hominem (personal attack), tu quoque (counter attack) and post hoc ergo propter hoc (mistaking correlation for causality), along with others derived from Aristotle’s Sophistical Refutations. The Latin may be off-putting, but in truth these are simple ideas which are readily grasped. If one were to discount arguments in which these fallacies were committed, virtually all online disputes would disappear.

Unfortunately I don’t think this is the case at all, and it is worth examining why.

If the problem with online disputes were simply poor argumentation, this thesis might be right. But instead, what one frequently comes across online these days is the spectacle of two or more people beating each other over the head with contradictory studies. And the real sign of the times is that this is occurring much more frequently in areas which belong as much to the political and social domain as to the “scientific”.

The tactic of using an academic paper, full of either abstruse technical language or impenetrable jargon, to support one’s point is a very useful one. Partly, of course, because picking holes in detailed academic arguments requires far more than a simple familiarity with logical and rhetorical fallacies. But partly also because the very word “science” carries such a magical aura these days that expressing doubt in the face of it amounts to virtual heresy, if one is to keep one’s intellectual credentials intact. (Incidentally, if anyone uses the dreaded definite article before the word “science”, you can be sure that they are not interested in argument at all; they are interested only in bludgeoning people into submission.)

Invoking (the) science is intended to take the “discussion” beyond the realm of logic and argument – even when a moment’s reflection would confirm that the normal application of human reason plays an important role in the matter at hand. Its purpose is to render utterly redundant the sort of critical thinking which Mr. Doyle rightly values very highly.

Two areas in which this can be seen every day in online debates are the ongoing climate change squabbles, and the bickering over the ongoing Covid-19 lockdowns. It should hardly need pointing out that public policy in these areas is a political and social question as much as a scientific one. It should also be plain to most thinking people that predictions and mathematical modelling in these areas should not be afforded the same reverence as empirical observation. But these basic considerations simply get lost in the noise when the word “science” is invoked.

So we come back to the role of education. What can teachers, and education systems more broadly, do to improve the lamentable standard of debates between supposedly civilised and mature individuals?

Stand-alone courses in “critical thinking” won’t do the job. I have seen a few of these myself, and beyond offering a few pointers about the sorts of logical fallacies that Mr. Doyle mentions, they all (ironically, and shamefully) appeared designed to steer students directly towards the flaccid, unassailable orthodoxies of our time.

All we can really do as teachers, I feel, is to set a good personal example. Ensure that our own use of logic is sound and fearless. Answer questions without either evasion or empty appeals to authority. Accept alternative points of view offered by students with equanimity and interest. Gently suggest that “science” does not offer the solution to every problem or disagreement, particularly the sort of “science” which is barely worthy of the name. And most important of all, robustly and effectively call out the sort of intellectual and moral bullying which infects so much of online discourse.

 

 

Stress and Pause

During my teaching stint in China (see here), one of my colleagues was a tall, genial Canadian teacher and linguist who was preoccupied to the point of obsession with the importance of “stress and pause” in improving the speaking skills of EFL students. Although this became something of an in-joke among the teachers eventually – our Chinese supervisor wittily wrote in one of his regular bulletins that he “would like to pause to stress the importance” of one of the announcements – it is an important concept; choosing the right syllable to stress in an English word, and appending a suitable pause afterwards, is often just as important for listener comprehension as pronouncing vowels and consonants correctly.

Although this is an issue of prosody rather than pedagogical technique, the principle of stress and pause can also be applied to teaching. This was brought home to me quite tellingly in the past couple of days.

Like most teachers, I have been making the best of a less-than-ideal situation in recent weeks. Although I have delivered a few video lessons, much of my teaching has been by means of old-fashioned text, buffered by activities and quizzes of various kinds. One of my classes is a beginner Year 7 class, and it is here that my pedagogical lesson on stress and pause arose.

I noticed in the students’ answers to an introductory exercise that they were struggling with a fundamental concept that is usually done and dusted after the first couple of lessons. Wondering why it had failed to sink in this time, despite what I thought was a nicely clear explanation, I tried to remember the manner in which I usually present it in class. My lightbulb moment came on realising that usually, during the explanation I really emphasise the key words – the stress – and leave a good few seconds before moving on to the checking-for-understanding phase – the pause.

This is subconscious by now, but this morning, on explaining a much more advanced concept to a senior class, I found myself doing it again. Key words – stressed. After the presentation of the concept – a notable pause.

I think this is probably a technique that most teachers pick up “along the way”, but the question arises, how can it be incorporated in textual explanations? Bold type, italics, repetitions and capital letters can replicate the stress to some extent (although they can also make the writer look ridiculous at times), but what about the pause?

In my view, we should consider including in the text little admonitions like “please read through the previous sentence again, a couple of times if necessary, to make sure you understand”. This may seem trivial, but it is about as close to replicating the teacher’s instinctive pause as is possible in a textual explanation. After all, when faced with a wad of text, it is often hard for students to discern which parts are really worth, well, pausing over.

The Double Front

In a post from last year, I mentioned that a former student of mine was finding that Henri Giroux was the iconic figure in his undergraduate education course. For me, it was Paulo Freire. I still remember the languid, condescending voice of the lecturer who all but deified him. For others, it may have been the familiar double-act of Piaget and Vygotsky. One thing is for certain though: it is unlikely to have been Wilhelm von Humboldt, Matthew Arnold or Hannah Arendt.

The only types of educational thinkers that seem to matter in the modern academy are the constructivists and/or the social-justice crusaders. The implicit message passed on to undergraduates is that no-one thought seriously about education prior to these luminaries, or that if they did, their ideas have long since been superseded. (Again, I am relying on over twenty years’ worth of conversations with trainee teachers for this conclusion.)

Recently an education academic posted a tweet which encapsulated this attitude magnificently.

Hmm. Well, there are, of course, other writers on education, with different viewpoints, who are worthy of teachers’ attention. But you certainly wouldn’t know it from most initial teacher education courses.

It often feels as if the humanists (or “traditionalists”, if you must) among us are fighting a philosophical battle on two fronts.

The constructivists have somehow mutated into the current Robinsonite tendency, where “personalized learning” is all the rage and phrases such as “entrepreneurial thinking” make cringe-inducing appearances on trendy websites.

You would expect the social-justice brigade to be fundamentally opposed to this development, and in fairness, some are. But whenever one of the no-best-way-overall-except-mine regulars takes the latest cheap shot at teachers who favour content knowledge over fads, one of the Freirites is often lurking in the background, happy to add their/zeir support to the criticism.

The researchED movement has been particularly vulnerable to this sort of two-pronged assault, especially given the propensity of the Freirites to impute the most malign intentions to the most fair-minded individuals, or to feverishly examine people’s “connections” in order to identify ideological impurities.

This became clear during the farcical episode of a list of edutweeters provided by a UK history teacher, and most recently in the positively surreal case of the much-admired blogger Ben Newmark and his video lesson plan. The Freirites and their ilk are not noted for either moderation or breadth of moral imagination.

All of this is by way of explaining why researchED and similar grassroots movements have usually had to bypass tertiary education faculties and appeal directly to ordinary teachers. Next time you hear an education academic complaining that they have been ignored or cruelly sidelined by one of these informal groups, you know exactly why.

The EdTech Sceptics

The always ebullient Tom Bennett caused something of an edutwitter stir yesterday with a tweet expressing sentiments with which, I would argue, the majority of teachers (not to mention children and their parents) would agree at this point. He was subjected, predictably, to a good deal of vitriol by people who hadn’t bothered to understand what he was saying.

Fortunately, others were on hand to clarify matters. But there is a point about the current EdTech debate which is indicative of broader trends within the education world.

To put my own cards on the table: I am grateful for the fact that technology has allowed me, for the past few weeks, to teach my classes remotely and deliver the curriculum in a relatively, if not completely, seamless manner. But I am also aware that this manner of teaching comes an extremely poor second to the face-to-face environment of the classroom, for all sorts of reasons.

Those who are expressing scepticism about the potential of education technology are not demeaning the efforts of individual teachers (and school executive) to adapt to these extraordinary circumstances. They are not deriding the achievements of those who have created the software and apps that have allowed teachers and their students to achieve some measure of normality in the Covid-19 world. What they are doing is pouring some cold water on the giddy, juvenile calls for a complete revolution in the structure of education at primary and secondary level.

To say that the experience of the last month gives few grounds for enthusiasm about such a revolution would be a severe understatement.

But here is the interesting thing.

It is not, on the whole, the tech startup types who have been leading the calls for an “education revolution”. Far less is it the individual teachers. Instead, of course, it is the usual gang of edu-gurus who are so gleefully seizing the “opportunity” of a global pandemic to push their garbled ideology to the forefront.

Whether it is Professor Yong Zhao (“literacy is less important now because of text-to-audio translators”), David Price (“any teacher who can be replaced by a YouTube video should be”), or Andreas Schleicher (take your pick of a hundred gormless quotes from over the years), it is the hardened progressivists of academia, consultancy and the NGO gravy train who are calling for EdTech to be the new reality rather than a useful support and fallback.

It is they who are childishly insisting that “we can’t possibly go back to the way it was before!”.

And it is they, not the boffins of the EdTech world, who are revealing just how pathetically out of touch they are with the everyday experiences of teachers, parents and children.

The Trojan Horse

“Never let a good crisis go to waste” is an admonition which is frequently, and perhaps incorrectly, attributed to Winston Churchill. Whoever originally said (or wrote) it has found a very willing audience among the ranks of progressivist “educators”. As a matter of fact, given the barely-disguised glee with which they have been banging their respective drums in recent weeks, a cynic might be tempted to conclude that they are relishing the opportunities provided by this grim, unprecedented period in modern human history.

We have had the predictable calls to scrap the ATAR as a means of determining entry to Australian universities; not just for 2020, but for good. We have had the wide-eyed encouragement to step up the implementation of “online learning”, as if its obligatory implementation so far has been anything other than a stressful muddle. And now, the icing has been put on the cake by one of the high priests of the progressivists, the egregious Professor Yong Zhao (you remember, the “literacy is less important now because of text-to-audio translators” man).

I suppose you have to give Professor Zhao full marks for chutzpah. To transmute a period of immense frustration, uncertainty and anxiety for students into “an opportunity for reimagining education” takes, well, a certain impish self-confidence.

So what exactly is the good Professor suggesting? Naturally, it’s the usual impenetrable forest of abstract nouns:

…global interdependence…interconnectedness…competencies…authentic learning experiences…

These things are apparently needed now more than ever because of, you know, “xenophobia, racism, nationalism” and all the other Bad Things.

All of this is, of course, boilerplate. But what I find much more interesting and revealing are Prof. Zhao’s justifications for “reimagining education”.

First, Covid19 has forced the cancellation of many high stakes examinations students have been subject to, at least temporarily removing the pressure to teach to the test.

Note one thing in particular here: the huge elision between teaching a subject for which there exists an external examination and “teaching to the test”, as if all that teachers did (even prior to matriculation year) was drill students in the types of questions used in an external examination.

This outlook dovetails closely with the “test-taking-as-a-skill” delusion, about which I will probably have more to say in the future.

Third, governments and accrediting bodies cannot reasonably expect schools to comply with their prescribed curriculum during the crisis.

Again, a staggering elision. Are schools now exempted from doing their best to continue teaching five-year-olds to read? Governments can, and have every right to, expect that schools will do their utmost to deliver the prescribed curriculum. This would be expected by parents and the community at large, and is actually far easier on teachers (not to mention students) than taking some ill-considered leap into the unknown.

Fourth, online education is not conducive to deliver (sic) high quality instruction of some traditionally valued subjects.

The fact that this is being adduced as a reason to make wholesale and permanent changes to the curriculum surely speaks for itself in its fatuity. I will spare you the dozens of obvious, and damning, analogies. And the fact that such a line of reasoning is being used by a professor of education is…breathtaking.

Sixth, during this crisis, parents and the public are more concerned about the physical safety as well as social and emotional wellbeing than academic content, so should educators.

Never mind the elisions, the professor does a good line in false dichotomies as well. And there is, of course, an obvious objection here: which is the more likely to foster “emotional wellbeing” during this period – work with clear instructions which draws on existing knowledge and skills and/or builds on them incrementally, or ill-defined, open-ended tasks replete with the abstract nouns which Prof. Zhao is so keen on? I think most teachers and students (not to mention parents) would opt for the former.

In my view, the most important thing for anyone to keep in mind during this miserable period is: this is temporary. It may last for a considerable time, but eventually we will return to something approaching normal life…with all its comfort and familiarity. With this in mind, I think it is particularly important that educational bodies are wary of making the sort of sweeping changes being demanded by those who are using Covid-19 as a Trojan horse. Even leaving aside the likely effects on teachers and students, these avid proposals for change currently appear to be based on little more than wishful thinking, airy futurism, and in many cases sly opportunism as well.

Different, But Same

“You look good together,” says Pat Morita’s Mr. Miyagi character in the original Karate Kid film, when he sees a photo of his protégé Daniel with his girlfriend. “Different, but same.” Daniel, in typical Hollywood style, has had the customary prior-to-the-denouement disagreement with his beloved. “No,” he responds glumly, “Different but different.”

There is a certain type of “educator”, usually a consultant or pressure-group loudmouth, who tends to get quite hot under the collar when anyone suggests that children are “different, but same” in the way that they learn. Recently, another Daniel, the respected cognitive scientist Dan Willingham, caused some consternation among edutwitter’s self-appointed children’s champions with this tweet. Although he has since expressed regret that he put things so starkly, there is no fundamental reason to disagree with his assertions. The problem is that most of those who clutched their pearls over the tweet appeared to read into it implications that weren’t really there.

So let me clarify what those of us who agree with Prof. Willingham on this matter do not believe. (I realise I am “speaking for myself” here, but others who agree on this point probably hold similar views…please let me know if not.)

We do not believe that all children will have exactly the same aptitude for learning. Environment, parental involvement, and (dare it be said) heredity will all play a role.

We do not believe that children will never learn things (including fundamental subject content) for themselves, without the agency of a teacher. Of course some will.

We do not believe that every child will approach new content with exactly the same on-the-spot capacity to master it. Background knowledge, emotional state and all the other factors will differ from child to child.

We do not believe that the blanket term “learning” excludes all the social and emotional growth that happens both in and out of school during a child’s years of schooling.

But the phrase “every child learns differently” is misleading at the very best. Far more importantly, it leaves the door wide open to the various constructivist canards which have done so much damage to educational standards in the Anglosphere over the past half-century. Learning styles may at last be on the way out (not before time), but project-based learning and associated fads are still all the rage, to the delight of consultants and conference blowhards and the despair of parents who care about their children’s education.

Children are all different, and teachers value and celebrate that difference. But when it comes to the mechanics of memory and factual or procedural knowledge, I’ll trust a leading cognitive scientist over a few cliché-wielding consultants any day of the week.

 

 

Phonology First, Part II

Some other matters arising from Prof. Jeffrey Bowers’ explanatory article on the SWI approach.

For example, consider the <w> in the spelling <two>. This <w> has no phonological role — therefore it is not a grapheme.

In my view, one of the major problems with the SWI approach is its, let us say, unconventional use of proper linguistic terminology.

The above conception of a grapheme, needless to say, is not in line with the conventional definition. Elsewhere in the SWI corpus, the letter o in autograph is described as a separate morpheme, which is again very much at odds with the accepted definition. “Fox” is considered to consist of three (not four) phonemes, which would be somewhat baffling to anyone familiar with the concept of minimal pairs (not to mention anyone who happens to notice that “rocks” is pronounced similarly).

If the idea is simply to redefine linguistic terminology in the interests of this new approach to orthography, one could say, who cares. But it is always worth keeping in mind that this approach is being pitched to young teachers, who may know little of linguistics; if they subsequently encounter the usual linguistic terms in their more familiar contexts, confusion is surely quite likely to ensue.

There are frequent hints (though not outright declarations) within the article that a phonics-based approach ignores, rather than defers until a developmentally appropriate stage, issues of morphology and etymology:

However, unlike phonics, SWI considers grapheme-phonemes within the context of morphology and etymology…

…phonics instruction that explicitly teaches children grapheme-phoneme correspondences in English without reference to morphology and etymology…

It would have been fair of Prof. Bowers to note that no serious proponent of phonics instruction, not one, decries the value of morphology and etymology at a later stage, or claims that familiarity with grapheme-phoneme correspondences alone is sufficient to become a competent reader and writer of English, with its deep orthography.

By leaving such statements simply hanging there, he gives tacit encouragement to the many people who regularly, and dishonestly, claim that advocates of phonics believe that reading instruction begins and ends with phonics.

Lastly, a tangential but important issue:

There is an overwhelming consensus in the research community that systematic phonics is best practice for early reading instruction in English.

This is undoubtedly true, but it is not the whole story. Those actually involved in proper research into early literacy have indeed consistently confirmed what common sense would already suggest, namely that thoroughly familiarising children with letter-sound correspondences initially is the most effective approach. But it is not in research publications that the battle for influence over the hearts and minds of trainee teachers is really fought. It is in the lecture theatres of initial teacher education courses.

What twenty years of interactions with trainee and first-year-out teachers has shown me is that attitudes to proper phonics teaching among ITE lecturers are almost uniformly negative, whatever the accumulated research may suggest. Phonics is simply lumped in with the other “traditional” practices and attitudes, and trainee teachers are implicitly encouraged to react from the gut in such matters, not from the evidence.

The anti-phonics zealots of the academy are also given plenty of editorial space in the “popular-academic” press, and a perusal of websites like the TES or The Conversation would reveal a preponderance of articles critical of phonics or phonics-related policies (such as the UK screening check). It is very much worth noting here that Prof. Bowers’ recent piece casting doubt on the research evidence for phonics was leapt upon by all the usual suspects, many of whom, incidentally, promote alternatives to phonics which are very different in approach and philosophy from SWI.

Morphology and etymology are fascinating, and very important. But they have their place, and it is not at the very beginning of reading instruction.

Phonology First, Part I

The very first essay I wrote in my undergraduate linguistics course was a defence of the English spelling system. My argument – inasmuch as my callow 18-year-old self was able to construct one – was that, given the unsuitability of the Roman alphabet to the English phonological system, not to mention the varied and often overlapping influences on the English language, our ramshackle orthography was not a bad compromise. (Those who like to draw social parallels could point to the trial-and-error accretion of English common law, or the outwardly bizarre English system of weights, measures and currency.)

Morphology and etymology played an important role in the essay. I wrote about sign and signal, and how the morphemic identity outweighed the phonological discrepancies. About debt and debit. Receipt and reception. And I wrote, too, about the need to preserve the orthographic form of a morpheme in the wake of the shifting schwa vowels occasioned by the stress patterns of English (think informant versus information, or photograph versus photography).

Nearly three decades later, my view hasn’t changed. I still think that, all things considered, the English spelling system is not nearly as bad or as capricious as it is sometimes depicted, and I am frankly pretty dismissive of the monomaniacs (such as a regular commenter on the Conversation website) who advocate a wholesale, Bernard Shaw-esque reform of English orthography.

Given all this, you would expect me to be supportive of the Structured Word Inquiry approach. And in some ways, I am. It is important for young people learning English to get to know something about etymology and morphology, even if they do not become familiar with those exact terms. Although much of the online instructional material associated with the group is full of etymological and other errors, the basic idea behind it – of dividing a word into its etymological or morphemic units – is sound enough. There are certainly aspects of SWI that appeal to me, although the online behaviour of their advocates is not among them.

But.

First of all, to go to the opposite extreme of the Bernard Shaws and claim that English orthography is entirely rule-governed and bereft of exceptions is futile; still less is it true that there is strict regularity in the adoption of Greek and Latin roots and morphemes into English. Recede and precede come from compounds of the Latin verb cedere, but so do succeed and proceed. The almost identical Latin verbal adjectives nobilis and mobilis give us noble, but mobile. (Yet, of course, nobility and mobility.) The Greek verbal noun suffix -ma looks to have entered English regularly enough if we consider schematic, idiomatic and dramatic, but not when we consider the base forms scheme, idiom and drama. I should add, in passing and in fairness, that not all SWI advocates adopt such an absolutist approach to English orthography.

Secondly, and far more importantly, there is a basic problem (another -ma word, folks!) at the heart of the SWI philosophy, one that it shares with many other approaches which either dismiss or relegate the need for children to obtain an initial grounding in phonics. In an article by Professor Jeffrey Bowers, in which he outlines the SWI approach, we find the following:

English prioritizes the consistent spelling of morphemes over the consistent spellings of phonemes…a language that prioritizes the consistent spelling of morphemes over phonemes is not “fundamentally alphabetic”.

The problem with this plausible contention is that like is not being compared with like. Morphemes are not unitary in the way that phonemes are: indeed, they are made up of one or (usually) more phonemes, in a specific pattern. And the orthography of the basic morpheme is, of course, determined by the phonology: it is not arbitrary.

The clearest indication of this comes, in fact, with new additions to the language. Foreign words, onomatopoeic words, and borrowings from slang are all initially adopted according to phonology (it could hardly be otherwise, since they will constitute a morpheme that doesn’t exist yet in the language). They may acquire –ed, –s, –ing and others along the line, and morphophonemic changes may occur. But it is, of course, phonology which determines the spelling of the new word. A can hardly be more “fundamental” than B if it depends on B for its component parts.

There is a good reason why, when field linguists produce a grammar of a language, they traditionally deal thoroughly with the phonology before moving to matters of morphology and syntax. It is simply the systematic way to proceed: deal with the building blocks first, then move on to the more exciting stuff. Mutatis mutandis, the same principle holds with initial literacy instruction, and for the same reasons.

More on Prof. Bowers’ article in Part II.

Future Schlock

It must take a special kind of chutzpah to double down on your own woolly education philosophy when data from your own organization consistently belies it. But Andreas Schleicher has never had a problem with this.

Despite the PISA results (for what they are worth) repeatedly suggesting, even on cursory analysis, that education systems which favour explicit direct instruction and “traditional” methods achieve superior results, Herr Schleicher is sticking to his guns, assuring us that in the future:

The next generation of young citizens will create jobs, not seek them, and collaborate to advance a complex world. That will require imagination, empathy, resilience, and entrepreneurship….a world that requires constant adaptation from learners…

It is becoming an unassailable truism in education that whenever a policy wonk or edu-guru lays claim to clairvoyance in this manner, they are doing so to distract attention from the fact that in the present, their preferred policies are failing miserably. But all will be different in the future, you see.

The other distinct red flag in the passage above is the mention of entrepreneurship. Herr Schleicher is a leading light of the worst-of-both-worlds tendency in education (see this blog passim), in which fluffy progressive posturing is allied with the Silicon Valley brand of stiletto capitalism. Philistinism is a core value of this particular educational philosophy:

There is a lot that government and society can do to help learners adapt. The easiest is telling young people more of the truth about the relevance of their learning, and to incentivise educational institutions to pay more attention, too.

Buckets are available on request.

Needless to say, self-satisfied CEO types relish all this “relevance” spiel, abetted frequently by politicians of a certain stripe. Some particularly inane Twitter comments by one of these recently led the redoubtable Carl Hendrick to re-post his outstanding blog piece on the mistaken “preparation for the workforce” attitude to education.

Let’s just look at one of the assumptions underlying Herr Schleicher’s waffle. What indications are there that the young people of tomorrow will “create jobs, not seek them”? What straws in the wind suggest that they will all need to be entrepreneurs in a few years’ time?

One would expect there to be credible evidence that small business startups are becoming more successful by the year. Yet all the (admittedly little) information I have been able to gather would suggest that failure rates for startups are either steady or increasing (there are some interesting graphs on this page).

If prognostications are based on observable trends, then those who purport to know the future can speak with a little confidence, tinged with a lot of caution. But without such trends to inform their commentary, they are simply groping in the dark like the rest of us.