Writing skills and grammar teaching: the misinterpreted study of Englicious

A teacher running an interactive grammar exercise (still from Englicious in the Classroom video)

A recent study, as you may or may not have heard, has found that teaching grammar to Year 2 children (age 6-7) does not improve their writing. But that’s not what it found.

The study, by researchers at UCL and the University of York, did not compare grammar teaching with no grammar teaching. It compared one particular programme of grammar teaching, called Englicious, with the grammar teaching that schools were doing already, and it found that Englicious produced results that were essentially no better than the other grammar teaching.

The research paper makes this very clear, although the conclusions section seems to stray a little into over-generalisation. And the news articles published by the two universities both lead with the generalised claim:

The teaching of grammar in primary schools in England (a key feature of England’s national curriculum) does not appear to help children’s narrative writing, although it may help them generate sentences, according to new UCL-led research.

UCL

Lessons on grammar are a key feature of the national curriculum taught in England’s primary schools, but they don’t appear to help children to learn to write, new research reveals.

York

This is the angle the media coverage has largely followed. (Note to journalists: always read the PDF.)

The background

Grammar teaching, which has become a bigger part of England’s national curriculum since 2014, is contentious. Some see it as providing the essential building blocks of literacy and good communication, while others think it a bewildering morass of jargon that has little relevance to how people really speak and write.

I broadly support grammar teaching in principle, although I don’t have strong views on how much to teach at particular ages and I can’t claim any expertise on teaching methods. But I have liked the idea of Englicious since I first heard about it.

Englicious is a set of resources to help teachers run classes on grammar in line with the curriculum. It aims to address two of the big worries about grammar teaching: that it’s dull and off-putting, and that the theoretical knowledge of grammar is disconnected from children’s actual writing skills. Its approach is strong on interactive exercises, trying to incorporate some fun, and it links grammatical concepts with practical writing work. This sounds to me like a great idea.

The study

To test it, the researchers recruited primary-school teachers and put them into two groups: one group used Englicious with their Year 2 classes, after an introduction to the resources and training in how to use them; and the control group taught grammar using the various approaches they were already using (this is the crucial detail that most of the coverage has obscured).

The research paper highlights one distinctive feature of the Englicious approach:

One key difference between intervention and control schools… was that the Englicious lessons consistently included an opportunity for pupils to apply their learning through an independent writing activity that was part of the Englicious lesson. It appeared that this was not a typical approach in every lesson observed in the control schools. In the control schools a wide range of teaching strategies was seen being used to support learning about grammar, for example general approaches to grammar teaching that included using a text to contextualise teaching of grammatical terms and their properties; teacher-led strategies including deliberate inclusion of errors when presenting texts; whole-class activities including discussions while pupils were sitting on the carpet; and use of mini whiteboards for pupils to write sentences…

Because Englicious was designed to link grammar to writing, the main way the researchers assessed its effect was through a narrative writing test, in which pupils had to create a narrative based on a prompt. They also used a sentence generation test, in which pupils were given two words as a prompt and had to generate sentences using them both – a task more focused on grammatical understanding.

The findings

Looking at the test scores from before and after a ten-week period of teaching, the study found that Englicious had no effect on the pupils’ narrative writing scores relative to the control group, and that it may have improved sentence generation scores a little – but this difference was not statistically significant (p=0.25).

The researchers gamely describe the second finding as “encouraging”, although I think “disappointing” would be a fairer assessment. Englicious may have been slightly helpful with actual grammar teaching (more research is needed), but it failed in its primary objective.

This doesn’t mean the approach is worthless: it seems to be at least as good as other current methods; it may have other benefits that weren’t covered by the two tests; and its effects on pupils’ learning might persist beyond the ten weeks this study covered. And the teachers who used Englicious gave largely positive feedback on it in questionnaires, saying that the lessons were a positive experience for both them and their pupils. That counts for something. They also made some suggestions that could improve Englicious.

The confusion

But there’s no escaping the fact that the study didn’t find the main desired effect. And here, perhaps, is where the widespread misunderstanding of the findings was born. The paper says:

The lack of effect on narrative writing is the main outcome of our research, and is consistent with previously published studies on grammar and writing at primary education level.

These older studies, though, didn’t look at Englicious or the post-2014 English curriculum. The general thrust of the previous research, as this paper summarises it, is that grammar teaching, on the whole, shows little if any sign of improving pupils’ writing. To further support that conclusion, you would have to compare grammar teaching (of some kind) with no grammar teaching, not compare different kinds of grammar teaching.

One sentence on the paper’s methodology acknowledges this problem:

The context of England’s national curriculum requirements meant that it was not feasible to have a control group that did not have any grammar teaching, a control that some would regard as a better comparison.

That last bit seems quite an understatement.

In light of this problem, I struggle to see how the following conclusion is a fair reflection of a study that compared different methods of teaching grammar:

The research found that seven-year-old pupils’ narrative writing was not improved as a result of grammar teaching.

I suppose the researchers could say “We gave it a really good shot, but all we’ve got is another way that doesn’t work; this suggests that the whole idea of connecting writing skills to grammar teaching is doomed to failure.” But it’s clear that they think the Englicious approach has room for improvement.

The upshot

I wish the Englicious team the best in further developing their work. Their approach has merit – it’s just not yet clear how much. And I agree that there are broad grounds for concern about how grammar teaching can improve writing skills.

This was a valuable and well-conducted study – the first randomised controlled trial in the world to examine this topic. But I think it has been over-interpreted: in parts of the conclusion and the university press releases, and in media reports that don’t look too closely at the detail.

Grammar teaching is politically fraught. For many people, it taps into notions of authority, discipline and tradition versus liberalism, diversity and modernity. Debates on it can get pretty heated, with people often falling back on ideological preconceptions. So if we want to improve how grammar is taught, we all need to be clear about what the relevant evidence does and doesn’t show.

Bad grammar, bugbears and dæmons

Dafne Keen as Lyra with Pantalaimon, from the BBC adaptation of Philip Pullman’s His Dark Materials.

Susan McDonald, an experienced subeditor at the Guardian, has written an article that appears to be about grammar and usage but is really about everyone’s favourite topic: how annoying other people are.

McDonald doubtless knows more than a thing or two about whipping ungainly sentences into shape. Her daily work involves tweaking punctuation, replacing clichés, shepherding stray verbs towards their subjects, and making all sorts of other small changes that smooth the path from the writer’s brain to the reader’s.

But she says she doesn’t nitpick for the sake of it, instead using common sense to decide when rules can be bent. I agree with that as a broad principle, but the thing about common sense is that it’s never as common as you think. What strikes one person as sensible flexibility will strike another as sloppy inconsistency; one person’s high standards will be another’s restrictive dogmatism.

McDonald gives some examples of things that definitely do matter (to her):

Some of my personal bugbears come up more regularly than others.

“Compared to” is different from “compared with”. Use the first if you want to liken one thing to another, the second to contrast.

And that reminds me: in my book “different from” is correct, “different to” and “different than” are not.

“Who” and “whom” are both needed but not interchangeable. The same goes for less/fewer, like/such as and imply/infer.

As a copyeditor, I think I would be absolutist about only one of these six. For moderately formal pieces, I’d probably apply three or four more of them across the board and the other one or two depending on context.

But I would also usually avoid using ‘regularly’ to mean ‘frequently’, as McDonald does here – so on that point I’m more of a stickler than her. And there are people who would scorn both of us for beginning sentences with ‘and’.

There’s no objective way of telling which ‘rules’ are the ‘correct’ ones. Any of us can talk about what’s right ‘in my book’ – but each of our mental rulebooks is different.

Some people respond to these differences by saying that the safest thing to do is always follow all the rules – that way, you won’t annoy anyone. But a lot of these (alleged) rules are, to put it politely, stupid. Picking every conceivable potential nit would be enormously time-consuming and make prose awkward, stiff – even annoying. McDonald rightly points out that, for instance, split infinitives and sentence-ending prepositions often produce better results.

A lot of these decisions are judgement calls. You have to think about audience, medium and desired effect. You have to keep abreast of how people are using the language and how they expect it to be used. You have to estimate which fine distinctions are too fine to be reliably clear, and you have to have a nose for what might be likely to cause a stink.

You also have to remember that the people who complain about ‘broken rules’ are far louder than those whose reading is eased by a certain breakage – but that doesn’t mean the loud ones are anything like a majority.

Sometimes there’s no right answer. Language isn’t like mathematics; it’s like life.

McDonald describes her linguistic gripes as bugbears, and many people talk semi-fondly of having pet peeves, but really these attitudes are more like Philip Pullman’s dæmons – they’re aspects of ourselves. They are changeable during childhood but become fixed as we grow up. They might cause us annoyance, but they are a dear, cherished part of who we are, and any attempt to separate them from us causes terrible pain.

The last line of McDonald’s piece is:

Language reflects – and can even define – who we are. So a little respect, please, for its rules.

It’s not just language but also our attitudes to language that are part of our identities. But the ‘rules’ we get the most righteously angry about don’t belong to language in itself. They belong to our personal conception of it. And when we meet someone whose internal rules are frustratingly different, we have two options: banish their dæmon or pacify our own.

Neither is easy.

Why can’t the English learn to speak about English?

Rex Harrison and Audrey Hepburn in My Fair Lady

Clare Foges has written a passionate column in the Times about the dangers posed by linguistic prejudice.

She highlights evidence of discrimination against people, especially young people, whose speech doesn’t fit in. Civil servants who lack the “right accent” are less likely to get promoted, even if they do good work; in industries from law to accountancy, those who don’t sound middle-class enough are less likely to be hired, even if they have good grades; and many professionals give lower ratings to answers delivered in certain varieties of English, even when the content of those answers is good.

As Foges says, people who speak dialects other than standard English “are surrounded by invisible barriers to success, yet we as a nation are too squeamish to say anything about it”. I commend her for raising this injustice, and for her directness in saying that “We do young people no favours by pretending that the way we speak doesn’t matter any more, because it does.” Collectively, we do need to get better at talking about our language and the role it plays in society.

The current situation isn’t just bad for those who are looked down on because of the way they talk. It’s bad for all of us. It means that the legal system is needlessly denying itself valuable talent. It means that the accountancy firms that businesses rely on are carelessly tossing out some of the best recruits available. And it means that we, the taxpayers, are getting poor value for money when mediocrities get ahead in the civil service while more capable public servants are overlooked.

How, then, can we combat this economically and socially ruinous linguistic prejudice? Foges has a simple solution: elocution lessons.

Wait, what?

Yes, she wants to give young people “speech coaching”, to endow them with the diction of the middle classes and the grammar of standard English. The best way to deal with prejudice, it seems, is to eliminate its object.

She briefly considers an alternative proposal – that linguistically prejudiced people should “challenge their biases” – but she isn’t interested. Because, you see, she agrees with them that standard English is better than other dialects. Here’s her argument:

Standard English is best because (the clue is in the name) it is the standard, with rules the vast majority understand. It is the medium through which writers and speakers of the language can achieve maximum clarity and minimum confusion. This is why deviating from it can grate. If people speak sloppily, mangling their grammar and failing to enunciate their words properly, language turns from a window between souls into a wall between them — and swiftly, subconsciously, we label the speaker.

In the spirit of diplomacy, I will try to meet Foges partway on this.

I agree it is important for children to learn standard English – but not because it’s better than other dialects or more precise or more expressive. It’s useful to know because it’s widely used in the public sphere, in business, in academia… in the kind of professions traditionally dominated by people who grew up in well-to-do families that speak standard English.

But standard English didn’t achieve its high social status in some merit-based competition, beating other dialects because of its intrinsically superior grammatical conventions and vocabulary. It got where it is because of William the Conqueror.

Before 1066, the capital of England was Winchester, which had previously been part of the Kingdom of Wessex. Back then, regional differences in English were much bigger than they are today, but the West Saxon variety – spoken in Winchester, the seat of power – looked set to become top dog. After the Norman Conquest, though, the capital was moved to London. The locally spoken Mercian became the high-status dialect, and out of it grew what we now call standard English.

While I bear our friends and neighbours in France no ill will for that ancient act of aggression, I cannot fathom why we in England should let a long-dead Frenchman determine which variety of our own language is best.

Standard English is only standard because of an accident of history. And different dialects are not “deviations” from it, they’re not “sloppy” or “mangled” attempts to speak it: they’re just different. What they lack is not clarity but prestige.

Yes, kids should learn standard English, but they should also learn that English is a family of dialects, related to region, class, and more recently ethnicity. All of these dialects change, including standard English; they influence each other and their borders overlap. (Foges complains about the adoption of Multicultural London English by “teenagers in country towns who desperately want to appear cool”, but changing the way you talk to make yourself come across a certain way is exactly the policy she recommends.)

But if we want to reduce the ill-effects of linguistic prejudice, there’s another side of the coin. Those of us who are perfectly at home with standard English should be careful how much importance we attach to whether others speak like us, especially if we’re in positions where we can help others to advance.

I am an editor. It is literally my job to improve other people’s writing – pretty much always in standard English. I fix typos, I substitute words, I tweak grammar, I rejig paragraphs, trying to help my colleagues come across as clearly and effectively as possible. What I absolutely do not do is judge those of them who are less comfortable than me with the niceties of standard English.

I’ve worked with a lot of people who are good at all sorts of things – and some of them not so good – but I’ve never noticed that their skills and commitment have any correlation with their dialect, accent and enunciation.

So this attitude described by Foges is a lamentable mistake:

When someone says “could of” instead of “could have”, or “pacifically” instead of “specifically”, or “froo” instead of “through”, they are labelled. The interviewer labels them “not sharp enough”. The colleague labels them “not up to it”.

It’s not quite clear whether she shares this attitude herself, but it’s damn clear that she’s not going to do anything to oppose it.

Foges ends by quoting Henry Higgins talking to Eliza Doolittle:

Think what you’re dealing with. The majesty and grandeur of the English language, the greatest possession we have. The noblest thoughts that ever flowed through the hearts of men are contained in its extraordinary, imaginative, and musical mixtures of sounds.

But Eliza’s English is no less English than Henry’s. If all our speech were standardised, those mixtures of sounds would be so much less extraordinary, imaginative, and musical.

Fronted adverbials: what the hell is going on with English grammar?

Every so often a kerfuffle erupts about the teaching of grammar in English schools, and the focal point these days is often the ugly term “fronted adverbials”. What on earth are these obscure things, people wonder, and why are young children being forced to learn about them?

Two daunting words, one simple concept

Let’s start with “adverbial”. It looks like an adjective, doesn’t it? “Of or relating to adverbs”, that sort of thing. But, unhelpfully, it’s actually a noun. An adverbial does the same job as an adverb (modifying a verb or a clause, typically to express manner or time), but it can consist of more than one word.

And “fronted” means it’s at the start of a sentence or clause. That’s not too tricky to grasp, but we hardly use “fronted” to mean that in other contexts, so the phrase feels kind of strange.

Here are some fronted adverbials:

  • Cheerfully, I bit into the apple.
  • Yesterday evening, they went out.
  • Further along the road, a cat was sitting on top of a car.
  • When the going gets tough, the tough get going.

It’s not nearly as hard as it sounds. Whichever linguist came up with the term needs to be docked a week’s pay, but a fronted adverbial is a pretty basic grammatical device that we all use every day.

(The term dates back to the 1960s, but it’s only recently escaped from academia to bother the public at large.)

Well, we never needed to be taught that in my day

Many of us are puzzled about why this new term is being taught in primary schools when we as adults have managed perfectly well without it.

I suspect that kids may not find this kind of novel jargon nearly as discombobulating as we adults do. When you’re at school, you expect to learn new things, to be taught new words for new ideas, all the time. “Fronted adverbials”, for all its unintuitive awkwardness, is just another to add to the list. But as adults, we like to think we already know what’s what.

It can be uncomfortable to find your children learning things that you don’t know, but that doesn’t necessarily mean that the new teaching is needless or too advanced, or that you are stupid. It just means that teaching has changed – in this case, because it’s become informed by a better understanding of English grammar.

(One of my nieces, aged five, impressed me by introducing me to the word “digraph”. It means a pair of letters that combine to make a single sound. So in the word “shoot”, “sh” and “oo” are digraphs. Simple. But the term itself is baffling if you don’t already know it.)

I’m neither a parent nor a teacher, and I don’t have a view about how old children should be when they meet adverbials and their fronting. I would say, though, that this concept should be taught not just as a term to memorise, with bland exercises that require kids to identify or create example sentences on demand. Fronted adverbials are a way of shifting emphasis, of re-ordering information, of changing the rhythm of a sentence. Children should be guided to try out using them, or not, in passages of writing to see the different effects that result. That’s how a knowledge of grammar really helps to improve your writing ability.

And yes, you can achieve that without knowing that there’s this specific term, but having a term can help you identify and think about the concept more tangibly than just “that bit at the end could go at the front”.

The theory can help with the practice. You can walk and run and jump without knowing anything about the muscles and bones in your legs, but if you want to train to get really good, or to deal with the risk of injuries, it’ll help to know a bit about what’s going on inside.

Fronted adverbials are only part of it, though. There’s a lot more grammar in the curriculum than there was in my day, and the grammar taught nowadays is quite different from the grammar that was taught in my parents’ day.

Let’s have a bit of a history lesson.

The decline and fall of the Roman linguistic empire

Social climbers, in their yearning for prestige, often adopt the language of the upper classes – often unconvincingly. The same thing happened to English grammar itself.

A rough, common tongue that grew up on a rainy island on the outskirts of Europe eventually came of age, and it realised that despite its growing literary prowess it still lacked respectability. So it modelled itself on the noblest and most distinguished language of all: Latin.

Sadly, some of the concepts and categories and distinctions used in Latin grammar didn’t fit English very well, but enough of the literati and educational establishment still insisted on adopting them. For this act of vanity, generations of schoolchildren have suffered.

Roman rule in Britain ended before the (Old) English language had even taken hold. But, a millennium later, Rome’s dead language colonised our understanding of our own. It ruled, ineptly but firmly, for a few centuries. Then, after a string of revolts, Latinised English grammar was finally overthrown in the 1960s, and – after a few decades of anarchy – a more authentic understanding of English grammar is now being applied in classrooms.

The new regime has had no shortage of teething problems, and sometimes its proponents implement their concepts with heavy-handed revolutionary zeal. It also has to contend with nostalgics, counter-revolutionaries, and people who just don’t want to have to think about this stuff.

(For a more thorough and less metaphorical telling of this history, try David Crystal’s book Making Sense: The Glamorous Story of English Grammar.)

The aversion to “fronted adverbials” is part of this. It’s a sound concept but it comes across badly. While the theory of grammar is in a much better place than it used to be, it will take a while to figure out how best to present the ideas in schools.

But what’s the point of talking about fronted adverbials?

This morning Michael Rosen, who abominates the current state of grammar teaching, wrote:

Just last week I was privileged to record a radio programme about writing with one of the great modern writers, Hilary Mantel. We talked about the sound and rhythm of sentences, the struggle to find the right word, the shaping of a paragraph so that it sets a scene before introducing a character, and much more. We talked for nearly an hour and we did not mention a fronted adverbial once.

The shaping of a paragraph so that it sets a scene before introducing a character. That’s exactly what Rosen does here, and he uses a fronted adverbial – “just last week” – to do it.

They’re worth knowing about. I just wish they had a better name.

How do you cope when everyone’s usage is wrong?

The remarkable thing about language change is that it only started happening when I started noticing it. For centuries, English was constant and true, but as soon as I was old enough to have an appreciation of good standards of usage, people around me started falling short. Since then, there has been an alarming, unprecedented surge in rule-breaking.

Neither I nor anyone else really believes any such thing, of course, but some of us sometimes talk as if we do. One such person is Lionel Shriver.

In an article in Harper’s, she wages war on what she calls “semantic drift”. Using the rhetorical style that’s obligatory for such pieces – mock-theatrical (and therefore deniable) moral horror – she rails against “decay”, “degeneration”, “blight”, “barbarism”, “mob rule” and the replacement of “civilised” with “contaminated” English at the hands of “animals”. Shriver’s a fantastic writer, but this kind of thing is just tiring.

The substance of this linguistic apocalypse is, as she sees it, the ignorant modern misuse of words such as literally, nonplussed, notorious, performative and enervated, and the blurring of distinctions such as less/fewer, as/like, who/whom and that/which.

On some of these, I think she has a point. While it’s unlikely anyone will be genuinely confused by “My head literally exploded”, the near-opposite meanings that nonplussed now has make it hard to use reliably. And it’s handy, even if only for formal occasions, to know how to whom. The that/which distinction, on the other hand, is needless. Most Brits (and a good many Americans) are indifferent to it, with no ill effects.

But Shriver’s examples of “semantic drift” also include grammar (flat adverbs and pronoun case) and punctuation (comma splices and indiscriminate dashes), so I guess the word semantic is drifting quite a bit too. She also makes it drift to include pronunciation, claiming that “‘flaccid’ is actually pronounced ‘flak-sid’”. In light of usage, which she accepts is almost entirely ‘flassid’, the meaning of actually must have drifted as well.

OK, that was cheap snark. But it gets us to the heart of the matter: what determines the actual rules of English?

There’s a view that the rules are wholly independent of the usage of English speakers, that the theory is what’s real and true while the practice is at best an approximation and more often a travesty. On this view, usage is evidence of nothing other than failure and corruption.

Nobody really believes any such thing, of course, but some of us sometimes talk as if we do.

The Good Book or the guidebook?

Shriver was raised as a language stickler, and the pedantry she inherited from her parents she reinforced at school. But for her, the ultimate authority, the guide to “official” English, is The Dictionary. She knows that she is fallible – her parents too – and is willing to take corrections when appropriately justified:

Hence when the copy editor on my first novel claimed that there was no such word as “jerry-rig,” I was incensed. Determined to prove her wrong, I went to my trusty, dusty-blue Webster’s Seventh (based on the august Webster’s Third), only to find she was right: “jerry-rig” wasn’t listed. Apparently I’d grown up with a garbled portmanteau of “gerrymander,” “jerry-build,” and the word I really wanted: “jury-rig.” The scales fell from my eyes.

A convert, I explained to my mother her lifelong mistake, but she was having none of it. “Oh, no,” she said gravely. “‘Jury-rig’ refers to rigging a jury, which is very serious.” Explaining the allusion to a “jury mast,” a makeshift sail, with no etymological relationship to a judicial “jury,” got me nowhere. It’s fascinating how ferociously people will cling to their abiding linguistic assumptions, however wrongheaded.

But there’s a twist: nowadays, dictionaries list the “incorrect” spelling as standard. “The mob – and my mother – have won.” Shriver, though, isn’t going to budge. Even though recent dictionaries now align with the way most people spell it – and the way Shriver herself long did – she has found her truth and she’s sticking to it, with the zeal of a convert whose prophet has snuck off to the pub.

For Shriver, a dictionary should be a rulebook of almost scriptural immutability. She wants usage to adhere to the rules that she spent time and effort internalising; any deviation, whether by the ignorant masses, by trendy literati or by dictionaries themselves, is to be fought.

The better way to view a dictionary is as a guidebook. It describes the features of the language as you’re likely to encounter it, and it thereby helps you find your way around. To do this, a dictionary needs to record differences in usage and it needs to be able to change.

Don’t just take my word for it, though.

Shriver’s “trusty” Webster’s Seventh New Collegiate Dictionary is a 1963 abridgement based on the “august” full-length Webster’s Third New International Dictionary, which came out in 1961. The Third was not seen as august at the time. In fact, it outraged many contemporary sticklers, who were appalled by its permissive, descriptivist approach. In the preface (the bit that nobody reads), its editor, Philip Gove, wrote that “a definition, to be adequate, must be written only after an analysis of usage”. He concluded:

This new Merriam-Webster unabridged is the record of this language as it is written and spoken. It is offered with confidence that it will supply in full measure that information on the general language which is required for accurate, clear, and comprehensive understanding of the vocabulary of today’s society.

Today’s society. As a new dictionary, it paid no heed to the aggrieved traditions of yesterday’s sticklers. And Gove knew that his work – his guidebook – would have a shelf-life. He knew that some of the language his team mapped would change in years to come. He wouldn’t have wanted the book to be treated as scripture almost six decades later.

But that scripture is what Shriver grew up with. That book formed part of the fundamental order of the world as she was honing her command of English, so it’s understandable that departures from it seem like creeping anarchy, like the destruction of something precious – like a “bereavement”, even.

Each generation thinks it invented language change

Maybe I can offer a scrap of consolation. Despite Shriver’s fears, language change definitely isn’t her fault.

Noting that she is more liberal than her father on some matters, such as the meaning of decimate, she says: “my own generation probably instigated this decline in the first place”.

Not guilty. Decimate slipped the bounds of “reduce by one-tenth” to start meaning “destroy a large part of” as early as 1663.

And some of the recent changes that make up her bugbears are not that recent:

  • Notorious, Shriver says, doesn’t just mean “well-known”. But the word dates back to the 15th century, when originally it meant exactly that. Over the years it acquired negative connotations, and for a long time it has mostly been used negatively – but only mostly.
  • She deplores the modern use of quicker as an adverb. But here’s Tennyson in 1865: “Nature… on thy heart a finger lays, Saying ‘Beat quicker’.” Adverbial quick has been in constant use since 1300 – informal, but hardly disreputable.
  • Performative is a term in linguistics, relating to utterances that enact what they state: “I promise”, “I warn you”, “I apologise”. Nowadays most people use it to mean “relating to performance”, but the correct word for that, she says, is performatory. In fact both words have a patchy history. JL Austin coined the technical sense of performative in 1955, but for several years before that he had been using performatory that way. For the performance-related meaning, performative goes back half a century earlier and is the norm today. Despite Shriver’s pessimism, the word’s linguistic meaning is alive and well too – among linguists. Many words comfortably carry more than one meaning, depending on context. We don’t need performatory and we shouldn’t mourn it.

Change didn’t begin with the baby boomers. It’s always been happening (and people have always been complaining about it). The rules Shriver grew up with were simply the customs of the day – some ancient, some much newer. Most are still in place, but the changes stand out. And even when the changes aren’t changes, the realisation that many or most people don’t follow your preferred conventions can be disconcerting.

Our language is part of our culture, our identity. We like things to be done our way, and we like to think that our way has some objective, enduring superiority. So yes, it’s fascinating how ferociously people will cling to their abiding linguistic assumptions, however wrongheaded.

Childhood glimpses of a truer reality

You know that electric thrill when something jostles a shapeless old sack full of dim notions in one of the dusty, dark rooms at the back of your head and it rolls over and its contents somehow rearrange themselves and suddenly coalesce to form a coherent idea that bursts out of the sack, crackling with energy, glowing with light, beautifully new and familiar at the same time?

My brain doesn’t treat me to those moments all that often, but I now owe one to Lane Greene and his new book Talk on the Wild Side: The Untameable Nature of Language.

It’s a great book, lucid and insightful, a riposte to the grumpy sticklers who complain that our language is going to the dogs, who see every change as a sign of decay. Greene rejects their implicit view of English as “an easily threatened logical system for conveying meaning without loss or confusion, which could crumble if we don’t mind it carefully”. Instead, he says, we should realise that language is “a robust, organic and evolving phenomenon that needs relatively little intervention”.

Our language may be untidy, inefficient and imprecise at times, Greene says, but it is hardy, resourceful and adaptable.

To make his case, he explores the natural, social history of language change, the links between language and national identity, and the enormous difficulties of teaching machines to speak. He marvels at the dreamers who have tried to invent logical new languages and despairs at the clueless certainty of amateur grammarians. He even explains the underrated linguistic skills of Donald Trump.

The bit that really made my brain perk up, though, was a single well-placed turn of phrase.

Greene discusses what happens when children first bump into unexpected “rules” of English, rules that clash with the language they’ve picked up from the adults around them. Lots of us can remember a teacher telling us not to say “Can I?” when asking for permission, or not to say “Me and Billy went”…

When children are suddenly told that what they know their parents and virtually everyone else says, and what they have been saying all their lives thus far, is “wrong”, there’s an early disconnect between the child’s native competence and the new idea of an invisible but Platonically correct language out there, one that nobody seems to be using.

As Greene says, the typical reaction to this is puzzlement and even humiliation at being wrong when you’re sure you were right. If you continue to be bruised by encounters with “grammar”, you’ll grow to resent it.

But here’s what this made me think: Not all children react in the same way to those early encounters with the Platonic realm of proper English.

There’s a scene from a David Tennant episode of Doctor Who where the Doctor talks about how the Master first became power-crazed:

When he was a child. That’s when the Master saw eternity. As a novice, he was taken for initiation. He stood in front of the Untempered Schism. It’s a gap in the fabric of reality through which could be seen the whole of the Vortex. You stand there, eight years old, staring at the raw power of time and space, just a child. Some would be inspired, some would run away, and some would go mad.

When kids glimpse Platonic English, some of them realise that they’re supposed to learn how to use it, and they make partly successful efforts to do so. Some are distrustful, thinking it a threat to their own language, and they try to avoid it. And some stare at this elusive, mysterious knowledge – a purer, higher truth that sets the elite few apart from the masses – and they go mad.

Is this the moment when apparently ordinary children turn into budding grammar-peevers? Is this the catalyst that makes a love of knowledge fuse with a respect for authority to produce a desire for linguistic superiority?

When I remember my own early experiences of being “corrected” by teachers, I think I managed to avoid going mad. Just.

I was one of those annoying kids who took pride in being clever, and I had a pedantic streak that nowadays is mostly under control. More charitably: I liked understanding things. And the rules of English were something I could take an interest in. I was at risk of proto-peeverism, and as young as seven or eight my speech and writing were already more formal than the average kid’s (even the average smart-alec kid’s), but I managed to keep my feet on the ground. During my teens I grew out of the idea that formal language was a sign of sophistication and intelligence and maturity.

But some people don’t grow out of it. They grow ever more fixated with the Platonic world, anxious to protect its perfection from the barbarian hordes of real life and zealous to conquer the rest of us in its name.

If they were to read Talk on the Wild Side, they might learn to ease up. Maybe just a little.

Alighting the escalator: a transitive in transit

The other day I saw this sign at a railway station:

Sign at Euston station: “When you alight the escalator keep moving forward”


Sorry the photo’s a bit blurry, but funnily enough I was on a moving escalator at the time.

When I saw it I thought: “‘Alight the escalator’? What in the name of Samuel Johnson’s gout-ridden ghost is this sorry abomination? Shouldn’t it be ‘alight from the escalator’?”

I hadn’t ever seen “alight” as a transitive, non-phrasal verb before. So, after apologising to the people I knocked over when I stopped to take the picture, I ran to the dictionaries – Oxford, Collins, Chambers, Macmillan, American Heritage, Merriam-Webster – but none of them had this transitive usage.

The norm is “alight from”, although the OED notes some historical uses of “alight out of”, “alight down from” and “alight off”.

So, is “alight the escalator” a quirk of one sign writer or an up-and-coming innovation? Or maybe even a nonstandard usage that has been rumbling along unnoticed for a long time?

It makes me think of “depart”, which is another official-sounding word strongly associated with transport signs and announcements, and which is increasingly being used transitively without a “from” (compare “depart from the station” and “depart the station”). But from what I can tell, transitive “alight” is a lot rarer than transitive “depart”.

When I started looking into usage of the verb “alight”, I found something surprising: there isn’t much. I already knew it was one of those quaintly formal bits of transport-speak, even more so than “depart”, but I hadn’t realised just how rare it is.

In the Corpus of Contemporary American English (a record of 520 million words used during 1990–2015), I found just 34 instances of “alighting/alighted from the”. There was one transitive, without the “from”.

In the GloWbE corpus of online usage, 774 million words from US and UK websites contained just 16 uses of “alighted/alighting from the” and five of “alighted/alighting the”.

These are tiny numbers.

A search of Google Books produced more examples of both. Uses of “alight/alighting/alighted from the train” massively outnumber those of “alight/alighting/alighted the train”. But still, the vast majority of the latter were results like “after you alight, the train will pull away” and “I should not have alighted. The train was gone…”

Most of the definite examples of transitive “alight the train” that I could find were post-2000. The earliest was in a US Congressional committee report from 1975.

So it seems that transitive “alight” is flitting around the edges of usage. If it starts to catch on, it will find itself pursued by a mob of angry pedants with plasticine pitchforks.

Like it or not (and I don’t), I think it could spread fast. When a word’s use is dominated by a small group – such as railway staff – there’s more scope for idiosyncrasy in that group to change that use. If signs like the one I saw appear across the country, and if announcements follow suit, that will set the tone for the rest of us.

But who knows? Anyway, there’s a bit more to this story: the beginning.

The earliest “alight from” in the OED’s records is from 1477, in William Caxton’s translation of Raoul Le Fèvre: “Peleus and Iason were alighted from their hors.” But “alight of” and other variations go back all the way to Old English, the first in the OED being from Ælfric of Eynsham in the late 900s: “Ic geseah þurh Godes gast, þa se þegen alihte of his cræte” (I saw through the Spirit of God that the officer alighted of his chariot).

And here’s the funny thing. Back then, there was a transitive sense of “alight”. It’s been obsolete since the 1600s, but it meant “to make light, or less heavy; to lighten, alleviate (a burden); to relieve (a person) of a burden”.

Here’s an early example, also from Ælfric: “þa ealdan cyningas on ðam aerran timan hogodon hu hi mihton heora byrðena alihtan forþan ðe an man ne mæg æghwar beon” (the old kings in previous times thought about how they might alight their burden, for one man cannot be everywhere).

And one from Reginald Pecock in 1449: “for this cause of aliȝting the poor men it is alloweable and profitable, that lordis and ladies haue mansiouns with inne the cloocis gatis”.

These two are figurative uses, but it could be literal as well: if you got off your horse, you would make it lighter; you would relieve it of its burden; you would alight it.

And if there had been an escalator around for you to get off, you would have alighted that too.

If the rule you followed brought you to this, of what use was the rule?

The only thing as bizarre and horrifying as the Trump administration’s loudening belches of vicious, incompetent corruption is the coverage thereof in the New Yorker. Specifically, the punctuation.

You may have seen this headline:

Donald Trump, Jr.,’s Love for Russian Dirt

Period, comma, apostrophe. What a grotesque sight for these three punctuation marks to be colluding so brazenly. Why did this horror happen?

The New Yorker’s Andrew Boynton explains:

The reasoning for the punctuation of “Jr.,’s” is pretty straightforward. It’s a collision of conventions. The first convention is The New Yorker’s—we place a comma before “Jr.” Doing so leads to another of our conventions: when something like “Jr.” occurs in the middle of a phrase, clause, or sentence, it is set off by its preceding comma and a following comma. Thus: “Ed Begley, Jr., was in ‘St. Elsewhere.’” A third convention is one that we all accept: the possessive is indicated by the addition of an apostrophe and “s.” We (the magazine) like our punctuation; we set things off with commas a lot; it drives some people nuts (i.e., it’s “bullshit”). This reaction is not surprising; it is also not new. With “Jr.” occurring in the middle of a line, where else is the possessive indicator supposed to go?

I’m happy to tell them where to stick it.

It’s silly

Boynton’s case all sounds very logical, but it’s also very silly. The New Yorker is a magnificent publication, and one of the most carefully copyedited in the world, but some of its style conventions are odd – and applied too zealously.

A house style on punctuation (or any other aspect of language) exists to help the reader get at what the writer is saying. Consistency is a big part of that: if an article’s comma style varies from one paragraph to another, it can be disorienting and distracting.

But English is a magical beast, and the ways we can ride it are myriad. This means that too much consistency can get in the way. Sometimes a mostly reasonable style rule can cause trouble, and the pile-up that is “Jr.,’s” is one of those times. In these cases, a wise copyeditor will break the rule – or find a way to avoid it.

Also, it’s wrong

Oh yes, and neither comma should be there anyway, regardless of the apostrophe. A New Yorker convention this may be, but the pair of them are confusing and misleading. To explain, I’ll rely on the magazine’s wonderful “Comma Queen”, Mary Norris.

She has repeatedly defended the first and second commas in this notorious New Yorker sentence: “Before Atwater died, of brain cancer, in 1991, he expressed regret.” Critics think these two commas needlessly fussy, but Norris says:

I took a good, hard look at the magazine’s policy, and I persuaded myself that in fact these commas were not indiscriminate. They marked off segments of the sentence that were not germane to the meaning. The point of the sentence… is that Atwater expressed regret before he died. What he died of and when he died of it are both extra details that the author, Jane Mayer, provides only to satisfy the reader’s curiosity. They aren’t essential to the meaning of the sentence. They are nonrestrictive.

I don’t wholly agree with this policy, but Norris’s reasoning is clear. The two phrases bracketed by commas – “of brain cancer” and “in 1991” – are inessential asides. You could cut them and the main statement would be intact: “Before Atwater died, he expressed regret.”

With that in mind, let’s look at the Trump headline again:

Donald Trump, Jr.,’s Love for Russian Dirt

Here, “Jr.” is bracketed by the pair of commas. It looks like one of those inessential, nonrestrictive asides. So let’s try cutting it to see if it leaves the main part intact:

Donald Trump’s Love for Russian Dirt

Nope. When there are two Donald Trumps in a story, “Jr.” and “Sr.” are absolutely essential. But the commas suggest otherwise.

Perhaps there is a line in the New Yorker style guide saying, “Oh, but in this sort of case, the commas are working differently.” But, if so, New Yorker readers don’t know this.

One of the lessons it took me a while to learn as a copyeditor is that you must work to satisfy readers who neither know nor care what your style guide says. If instead you work – however accurately – to correspond to a geekily intricate but internally consistent set of arcane rules, you are failing. You are working to satisfy yourself and your peers.

Those commas, like the Donald Trumps, are a repellent pair that I wish would appear in the news less often.

Paul Romer and the World Bank and conjunctions and brevity and bad targets

The World Bank’s chief economist, Paul Romer, has been demanding that his staff use the word “and” less.

Why? It’s such an innocuous little word.

This is his thinking:

Circulating a draft of the upcoming World Development Report, Mr Romer warned against bank staff trying to pile their own pet projects and messages into the report. The tendency, he argued, had diluted the impact of past reports and led to a proliferation of “ands”.

“Because of this type of pressure to say that our message is ‘this, and this, and this too, and that …’ the word ‘and’ has become the most frequently used word in Bank prose,” he complained in an email.

Romer mentions a study of World Bank writing, which highlights this phrase as an example:

emphasis on quality, responsiveness, and partnerships; on knowledge-sharing and client orientation; and on poverty reduction

This is 16 words (counting “knowledge-sharing” as two), of which three are “and” – 18.75%. That’s very high.

I agree that a lot of corporate writing – like this quoted phrase – tries to say too much, making it clunky, boring and unfocused. But a mechanical focus on the word “and” is the wrong way to fix this.

Imagine you’re a junior member of Romer’s team. As instructed, you’re trying to reduce the “and”s in that phrase. The simplest thing would be to change it to this:

emphasis on quality, responsiveness, partnerships, knowledge-sharing, client orientation and poverty reduction

Now you only have one “and” in 11 words – just 9.1%. A great improvement!

Except that the whole waffly list of things is still there. You may have hit the target but you’ve missed the point.

The point is to say less. So instead, you talk to your colleagues to figure out what’s truly important to keep in that list, and you all agree that it only needs to emphasise the first two points. So you can cut it down to this:

emphasis on quality and responsiveness

This is a great improvement. Shorter and much better-focused.

There’s only one problem: five words, one “and” – you’re up to 20%. Romer will not be happy. But Romer will be wrong.
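If you want to see just how gameable the metric is, here’s a minimal sketch in Python (purely illustrative – not anything Romer’s team actually uses) that computes the “and” share of a phrase the way the percentages above were tallied:

```python
def and_share(phrase: str) -> float:
    """Fraction of words that are 'and', counting hyphenated
    compounds like 'knowledge-sharing' as two words."""
    words = phrase.replace("-", " ").lower().split()
    return words.count("and") / len(words)

original = ("emphasis on quality, responsiveness, and partnerships; "
            "on knowledge-sharing and client orientation; and on poverty reduction")
trimmed = "emphasis on quality and responsiveness"

print(and_share(original))  # 3/16 = 0.1875
print(and_share(trimmed))   # 1/5 = 0.2
```

The trimmed phrase is shorter and sharper, yet it scores worse on the “and” metric – which is exactly the problem with targeting the word rather than the waffle.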

Word By Word: The Secret Life of Dictionaries – by Kory Stamper

It turns out that dictionaries don’t just coalesce out of the antique dust in academic libraries, hardening on the shelves into compendia of immutable, authoritative Truth. Human beings write them.

There are actually real people who pay intense attention to words and their uses, who record and catalogue these uses, who spend hours and days and decades sifting these uses and carefully analysing them – a different shade of meaning here, a grammatical variation there – and who then try to encapsulate these analyses in succinct, helpful definitions, so that chumps like me can one day look them up and announce: “Well, according to The Dictionary…”

Real people do this for a living. One of them is Kory Stamper, who has written a beautiful, fascinating, witty, loving, irreverent book about the life of the lexicographer.

In Word By Word, she tells us about her career at Merriam-Webster and the colleagues who wrestle with pronunciation, etymology and how to socialise with the extroverts from marketing. She tells us about the silent frenzies of defining that rage inside their cubicles, about how lexicography is not just a job but a way of life. She shares the joys of handling correspondence from a polite and well-informed public (hashtag sarcasm), and she talks us through some of the words that have caused the most trouble and how they were eventually brought to book.

How do you handle racial bias in explaining the meaning of “nude”-coloured pantyhose? Where can you look to find out where “posh” really came from? How do you tease apart the many, many senses of “take” without losing your mind? What are and aren’t the essential features of a “surfboard”? What exactly was the problem with an old definition of “bitch”? And how do you convince an angry mob that “irregardless” is a word whether they like it or not?

She explains what dictionaries really do and busts myths about what they don’t do. And she unearths some gems from the history of lexicography, spanning Samuel Johnson, Noah Webster and what happens when rival dictionaries fall out.

And she makes it all such damned good fun.

I’m a copy editor who blogs about language, and I like to think of myself as a word nerd. But I’m not. Kory, who toils endlessly on the linguistic equivalent of the Human Genome Project, is the very definition of a word nerd.

You should look her up.

Full disclosure: I slightly know Kory, and she bought me a beer once. So, for all you know, I’m biased and possibly drunk. Here are some other reviews by proper people: Megan Garber (Atlantic), Stan Carey (Sentence First), Jennifer Schuessler (New York Times), Caitlin PenzeyMoog (AV Club), Stevie Godson (New York Journal of Books).