Bad grammar, bugbears and dæmons

Dafne Keen as Lyra with Pantalaimon, from the BBC adaptation of Philip Pullman’s His Dark Materials.

Susan McDonald, an experienced subeditor at the Guardian, has written an article that appears to be about grammar and usage but is really about everyone’s favourite topic: how annoying other people are.

McDonald doubtless knows more than a thing or two about whipping ungainly sentences into shape. Her daily work involves tweaking punctuation, replacing clichés, shepherding stray verbs towards their subjects, and making all sorts of other small changes that smooth the path from the writer’s brain to the reader’s.

But she says she doesn’t nitpick for the sake of it, instead using common sense to decide when rules can be bent. I agree with that as a broad principle, but the thing about common sense is that it’s never as common as you think. What strikes one person as sensible flexibility will strike another as sloppy inconsistency; one person’s high standards will be another’s restrictive dogmatism.

McDonald gives some examples of things that definitely do matter (to her):

Some of my personal bugbears come up more regularly than others.

“Compared to” is different from “compared with”. Use the first if you want to liken one thing to another, the second to contrast.

And that reminds me: in my book “different from” is correct, “different to” and “different than” are not.

“Who” and “whom” are both needed but not interchangeable. The same goes for less/fewer, like/such as and imply/infer.

As a copyeditor, I think I would be absolutist about only one of these six. For moderately formal pieces, I’d probably apply three or four more of them across the board and the other one or two depending on context.

But I would also usually avoid using ‘regularly’ to mean ‘frequently’, as McDonald does here – so on that point I’m more of a stickler than her. And there are people who would scorn both of us for beginning sentences with ‘and’.

There’s no objective way of telling which ‘rules’ are the ‘correct’ ones. Any of us can talk about what’s right ‘in my book’ – but each of our mental rulebooks is different.

Some people respond to these differences by saying that the safest thing to do is always follow all the rules – that way, you won’t annoy anyone. But a lot of these (alleged) rules are, to put it politely, stupid. Picking every conceivable potential nit would be enormously time-consuming and make prose awkward, stiff – even annoying. McDonald rightly points out that, for instance, split infinitives and sentence-ending prepositions often produce better results.

A lot of these decisions are judgement calls. You have to think about audience, medium and desired effect. You have to keep abreast of how people are using the language and how they expect it to be used. You have to estimate which fine distinctions are too fine to be reliably clear, and you have to have a nose for what might be likely to cause a stink.

You also have to remember that the people who complain about ‘broken rules’ are far louder than those whose reading is eased by a certain breakage – but that doesn’t mean the loud ones are anything like a majority.

Sometimes there’s no right answer. Language isn’t like mathematics; it’s like life.

McDonald describes her linguistic gripes as bugbears, and many people talk semi-fondly of having pet peeves, but really these attitudes are more like Philip Pullman’s dæmons – they’re aspects of ourselves. They are changeable during childhood but become fixed as we grow up. They might cause us annoyance, but they are a dear, cherished part of who we are, and any attempt to separate them from us causes terrible pain.

The last line of McDonald’s piece is:

Language reflects – and can even define – who we are. So a little respect, please, for its rules.

It’s not just language but also our attitudes to language that are part of our identities. But the ‘rules’ we get the most righteously angry about don’t belong to language in itself. They belong to our personal conception of it. And when we meet someone whose internal rules are frustratingly different, we have two options: banish their dæmon or pacify our own.

Neither is easy.

Why can’t the English learn to speak about English?

Rex Harrison and Audrey Hepburn in My Fair Lady

Clare Foges has written a passionate column in the Times about the dangers posed by linguistic prejudice.

She highlights evidence of discrimination against people, especially young people, whose speech doesn’t fit in. Civil servants who lack the “right accent” are less likely to get promoted, even if they do good work; in industries from law to accountancy, those who don’t sound middle-class enough are less likely to be hired, even if they have good grades; and many professionals give lower ratings to answers delivered in certain varieties of English, even when the content of those answers is good.

As Foges says, people who speak dialects other than standard English “are surrounded by invisible barriers to success, yet we as a nation are too squeamish to say anything about it”. I commend her for raising this injustice, and for her directness in saying that “We do young people no favours by pretending that the way we speak doesn’t matter any more, because it does.” Collectively, we do need to get better at talking about our language and the role it plays in society.

The current situation isn’t just bad for those who are looked down on because of the way they talk. It’s bad for all of us. It means that the legal system is needlessly denying itself valuable talent. It means that the accountancy firms that businesses rely on are carelessly tossing out some of the best recruits available. And it means that we, the taxpayers, are getting poor value for money when mediocrities get ahead in the civil service while more capable public servants are overlooked.

How, then, can we combat this economically and socially ruinous linguistic prejudice? Foges has a simple solution: elocution lessons.

Wait, what?

Yes, she wants to give young people “speech coaching”, to endow them with the diction of the middle classes and the grammar of standard English. The best way to deal with prejudice, it seems, is to eliminate its object.

She briefly considers an alternative proposal – that linguistically prejudiced people should “challenge their biases” – but she isn’t interested. Because, you see, she agrees with them that standard English is better than other dialects. Here’s her argument:

Standard English is best because (the clue is in the name) it is the standard, with rules the vast majority understand. It is the medium through which writers and speakers of the language can achieve maximum clarity and minimum confusion. This is why deviating from it can grate. If people speak sloppily, mangling their grammar and failing to enunciate their words properly, language turns from a window between souls into a wall between them — and swiftly, subconsciously, we label the speaker.

In the spirit of diplomacy, I will try to meet Foges partway on this.

I agree it is important for children to learn standard English – but not because it’s better than other dialects or more precise or more expressive. It’s useful to know because it’s widely used in the public sphere, in business, in academia… in the kind of professions traditionally dominated by people who grew up in well-to-do families that speak standard English.

But standard English didn’t achieve its high social status in some merit-based competition, beating other dialects because of its intrinsically superior grammatical conventions and vocabulary. It got where it is because of William the Conqueror.

Before 1066, the capital of England was Winchester, which had previously been part of the Kingdom of Wessex. Back then, regional differences in English were much bigger than they are today, but the West Saxon variety – spoken in Winchester, the seat of power – looked set to become top dog. After the Norman Conquest, though, the capital was moved to London. The locally spoken Mercian became the high-status dialect, and out of it grew what we now call standard English.

While I bear our friends and neighbours in France no ill will for that ancient act of aggression, I cannot fathom why we in England should let a long-dead Frenchman determine which variety of our own language is best.

Standard English is only standard because of an accident of history. And different dialects are not “deviations” from it, they’re not “sloppy” or “mangled” attempts to speak it: they’re just different. What they lack is not clarity but prestige.

Yes, kids should learn standard English, but they should also learn that English is a family of dialects, related to region, class, and more recently ethnicity. All of these dialects change, including standard English; they influence each other and their borders overlap. (Foges complains about the adoption of Multicultural London English by “teenagers in country towns who desperately want to appear cool”, but changing the way you talk to make yourself come across a certain way is exactly the policy she recommends.)

But if we want to reduce the ill-effects of linguistic prejudice, there’s another side of the coin. Those of us who are perfectly at home with standard English should be careful how much importance we attach to whether others speak like us, especially if we’re in positions where we can help others to advance.

I am an editor. It is literally my job to improve other people’s writing – pretty much always in standard English. I fix typos, I substitute words, I tweak grammar, I rejig paragraphs, trying to help my colleagues come across as clearly and effectively as possible. What I absolutely do not do is judge those of them who are less comfortable than me with the niceties of standard English.

I’ve worked with a lot of people who are good at all sorts of things – and some of them not so good – but I’ve never noticed that their skills and commitment have any correlation with their dialect, accent and enunciation.

So this attitude described by Foges is a lamentable mistake:

When someone says “could of” instead of “could have”, or “pacifically” instead of “specifically”, or “froo” instead of “through”, they are labelled. The interviewer labels them “not sharp enough”. The colleague labels them “not up to it”.

It’s not quite clear whether she shares this attitude herself, but it’s damn clear that she’s not going to do anything to oppose it.

Foges ends by quoting Henry Higgins talking to Eliza Doolittle:

Think what you’re dealing with. The majesty and grandeur of the English language, the greatest possession we have. The noblest thoughts that ever flowed through the hearts of men are contained in its extraordinary, imaginative, and musical mixtures of sounds.

But Eliza’s English is no less English than Henry’s. If all our speech were standardised, those mixtures of sounds would be so much less extraordinary, imaginative, and musical.

Fronted adverbials: what the hell is going on with English grammar?

Every so often a kerfuffle erupts about the teaching of grammar in English schools, and the focal point these days is often the ugly term “fronted adverbials”. What on earth are these obscure things, people wonder, and why are young children being forced to learn about them?

Two daunting words, one simple concept

Let’s start with “adverbial”. It looks like an adjective, doesn’t it? “Of or relating to adverbs”, that sort of thing. But, unhelpfully, it’s actually a noun. An adverbial does the same job as an adverb (modifying a verb or a clause, typically to express manner or time), but it can consist of more than one word.

And “fronted” means it’s at the start of a sentence or clause. That’s not too tricky to grasp, but we hardly use “fronted” to mean that in other contexts, so the phrase feels kind of strange.

Here are some fronted adverbials:

  • Cheerfully, I bit into the apple.
  • Yesterday evening, they went out.
  • Further along the road, a cat was sitting on top of a car.
  • When the going gets tough, the tough get going.

It’s not nearly as hard as it sounds. Whichever linguist came up with the term needs to be docked a week’s pay, but a fronted adverbial is a pretty basic grammatical device that we all use every day.

(The term dates back to the 1960s, but it’s only recently escaped from academia to bother the public at large.)

Well, we never needed to be taught that in my day

Many of us are puzzled as to why this new term is being taught in primary schools when we as adults have managed perfectly well without it.

I suspect that kids may not find this kind of novel jargon nearly as discombobulating as we adults do. When you’re at school, you expect to learn new things, to be taught new words for new ideas, all the time. “Fronted adverbials”, for all its unintuitive awkwardness, is just another to add to the list. But as adults, we like to think we already know what’s what.

It can be uncomfortable to find your children learning things that you don’t know, but that doesn’t necessarily mean that the new teaching is needless or too advanced, or that you are stupid. It just means that teaching has changed – in this case, because it’s become informed by a better understanding of English grammar.

(One of my nieces, aged five, impressed me by introducing me to the word “digraph”. It means a pair of letters that combine to make a single sound. So in the word “shoot”, “sh” and “oo” are digraphs. Simple. But the term itself is baffling if you don’t already know it.)

I’m neither a parent nor a teacher, and I don’t have a view about how old children should be when they meet adverbials and their fronting. I would say, though, that this concept should be taught not just as a term to memorise, with bland exercises that require kids to identify or create example sentences on demand. Fronted adverbials are a way of shifting emphasis, of re-ordering information, of changing the rhythm of a sentence. Children should be guided to try out using them, or not, in passages of writing to see the different effects that result. That’s how a knowledge of grammar really helps to improve your writing ability.

And yes, you can achieve that without knowing that there’s this specific term, but having a term can help you identify and think about the concept more tangibly than just “that bit at the end could go at the front”.

The theory can help with the practice. You can walk and run and jump without knowing anything about the muscles and bones in your legs, but if you want to train to get really good, or to deal with the risk of injuries, it’ll help to know a bit about what’s going on inside.

Fronted adverbials are only part of it, though. There’s a lot more grammar in the curriculum than there was in my day, and the grammar taught nowadays is quite different from the grammar that was taught in my parents’ day.

Let’s have a bit of a history lesson.

The decline and fall of the Roman linguistic empire

Social climbers, in their yearning for prestige, often adopt the language of the upper classes – often unconvincingly. The same thing happened to English grammar itself.

A rough, common tongue that grew up on a rainy island on the outskirts of Europe eventually came of age, and it realised that despite its growing literary prowess it still lacked respectability. So it modelled itself on the noblest and most distinguished language of all: Latin.

Sadly, some of the concepts and categories and distinctions used in Latin grammar didn’t fit English very well, but enough of the literati and educational establishment still insisted on adopting them. For this act of vanity, generations of schoolchildren have suffered.

Roman rule in Britain ended before the (Old) English language had even taken hold. But, a millennium later, Rome’s dead language colonised our understanding of our own. It ruled, ineptly but firmly, for a few centuries. Then, after a string of revolts, Latinised English grammar was finally overthrown in the 1960s, and – after a few decades of anarchy – a more authentic understanding of English grammar is now being applied in classrooms.

The new regime has had no shortage of teething problems, and sometimes its proponents implement their concepts with heavy-handed revolutionary zeal. It also has to contend with nostalgics, counter-revolutionaries, and people who just don’t want to have to think about this stuff.

(For a more thorough and less metaphorical telling of this history, try David Crystal’s book Making Sense: The Glamorous Story of English Grammar.)

The aversion to “fronted adverbials” is part of this. It’s a sound concept but it comes across badly. While the theory of grammar is in a much better place than it used to be, it will take a while to figure out how best to present the ideas in schools.

But what’s the point of talking about fronted adverbials?

This morning Michael Rosen, who abominates the current state of grammar teaching, wrote:

Just last week I was privileged to record a radio programme about writing with one of the great modern writers, Hilary Mantel. We talked about the sound and rhythm of sentences, the struggle to find the right word, the shaping of a paragraph so that it sets a scene before introducing a character, and much more. We talked for nearly an hour and we did not mention a fronted adverbial once.

The shaping of a paragraph so that it sets a scene before introducing a character. That’s exactly what Rosen does here, and he uses a fronted adverbial – “just last week” – to do it.

They’re worth knowing about. I just wish they had a better name.

How do you cope when everyone’s usage is wrong?

The remarkable thing about language change is that it only started happening when I started noticing it. For centuries, English was constant and true, but as soon as I was old enough to have an appreciation of good standards of usage, people around me started falling short. Since then, there has been an alarming, unprecedented surge in rule-breaking.

Neither I nor anyone else really believes any such thing, of course, but some of us sometimes talk as if we do. One such person is Lionel Shriver.

In an article in Harper’s, she wages war on what she calls “semantic drift”. Using the rhetorical style that’s obligatory for such pieces – mock-theatrical (and therefore deniable) moral horror – she rails against “decay”, “degeneration”, “blight”, “barbarism”, “mob rule” and the replacement of “civilised” with “contaminated” English at the hands of “animals”. Shriver’s a fantastic writer, but this kind of thing is just tiring.

The substance of this linguistic apocalypse is, as she sees it, the ignorant modern misuse of words such as literally, nonplussed, notorious, performative and enervated, and the blurring of distinctions such as less/fewer, as/like, who/whom and that/which.

On some of these, I think she has a point. While it’s unlikely anyone will be genuinely confused by “My head literally exploded”, the near-opposite meanings that nonplussed now has make it hard to use reliably. And it’s handy, even if only for formal occasions, to know how to whom. The that/which distinction, on the other hand, is needless. Most Brits (and a good many Americans) are indifferent to it, with no ill effects.

But Shriver’s examples of “semantic drift” also include grammar (flat adverbs and pronoun case) and punctuation (comma splices and indiscriminate dashes), so I guess the word semantic is drifting quite a bit too. She also makes it drift to include pronunciation, claiming that “‘flaccid’ is actually pronounced ‘flak-sid’”. In light of usage, which she accepts is almost entirely ‘flassid’, the meaning of actually must have drifted as well.

OK, that was cheap snark. But it gets us to the heart of the matter: what determines the actual rules of English?

There’s a view that the rules are wholly independent of the usage of English speakers, that the theory is what’s real and true while the practice is at best an approximation and more often a travesty. On this view, usage is evidence of nothing other than failure and corruption.

Nobody really believes any such thing, of course, but some of us sometimes talk as if we do.

The Good Book or the guidebook?

Shriver was raised as a language stickler, and the pedantry she inherited from her parents she reinforced at school. But for her, the ultimate authority, the guide to “official” English, is The Dictionary. She knows that she is fallible – her parents too – and is willing to take corrections when appropriately justified:

Hence when the copy editor on my first novel claimed that there was no such word as “jerry-rig,” I was incensed. Determined to prove her wrong, I went to my trusty, dusty-blue Webster’s Seventh (based on the august Webster’s Third), only to find she was right: “jerry-rig” wasn’t listed. Apparently I’d grown up with a garbled portmanteau of “gerrymander,” “jerry-build,” and the word I really wanted: “jury-rig.” The scales fell from my eyes.

A convert, I explained to my mother her lifelong mistake, but she was having none of it. “Oh, no,” she said gravely. “‘Jury-rig’ refers to rigging a jury, which is very serious.” Explaining the allusion to a “jury mast,” a makeshift sail, with no etymological relationship to a judicial “jury,” got me nowhere. It’s fascinating how ferociously people will cling to their abiding linguistic assumptions, however wrongheaded.

But there’s a twist: nowadays, dictionaries list the “incorrect” spelling as standard. “The mob – and my mother – have won.” Shriver, though, isn’t going to budge. Even though recent dictionaries now align with the way most people spell it – and the way Shriver herself long did – she has found her truth and she’s sticking to it, with the zeal of a convert whose prophet has snuck off to the pub.

For Shriver, a dictionary should be a rulebook of almost scriptural immutability. She wants usage to adhere to the rules that she spent time and effort internalising; any deviation, whether by the ignorant masses, by trendy literati or by dictionaries themselves, is to be fought.

The better way to view a dictionary is as a guidebook. It describes the features of the language as you’re likely to encounter it, and it thereby helps you find your way around. To do this, a dictionary needs to record differences in usage and it needs to be able to change.

Don’t just take my word for it, though.

Shriver’s “trusty” Webster’s Seventh New Collegiate Dictionary is a 1963 abridgement based on the “august” full-length Webster’s Third New International Dictionary, which came out in 1961. The Third was not seen as august at the time. In fact, it outraged many contemporary sticklers, who were appalled by its permissive, descriptivist approach. In the preface (the bit that nobody reads), its editor, Philip Gove, wrote that “a definition, to be adequate, must be written only after an analysis of usage”. He concluded:

This new Merriam-Webster unabridged is the record of this language as it is written and spoken. It is offered with confidence that it will supply in full measure that information on the general language which is required for accurate, clear, and comprehensive understanding of the vocabulary of today’s society.

Today’s society. As a new dictionary, it paid no heed to the aggrieved traditions of yesterday’s sticklers. And Gove knew that his work – his guidebook – would have a shelf-life. He knew that some of the language his team mapped would change in years to come. He wouldn’t have wanted the book to be treated as scripture almost six decades later.

But that scripture is what Shriver grew up with. That book formed part of the fundamental order of the world as she was honing her command of English, so it’s understandable that departures from it seem like creeping anarchy, like the destruction of something precious – like a “bereavement”, even.

Each generation thinks it invented language change

Maybe I can offer a scrap of consolation. Despite Shriver’s fears, language change definitely isn’t her fault.

Noting that she is more liberal than her father on some matters, such as the meaning of decimate, she says: “my own generation probably instigated this decline in the first place”.

Not guilty. Decimate slipped the bounds of “reduce by one-tenth” to start meaning “destroy a large part of” as early as 1663.

And some of the recent changes that make up her bugbears are not that recent:

  • Notorious, Shriver says, doesn’t just mean “well-known”. But the word dates back to the 15th century, when originally it meant exactly that. Over the years it acquired negative connotations, and for a long time it has mostly been used negatively – but only mostly.
  • She deplores the modern use of quicker as an adverb. But here’s Tennyson in 1865: “Nature… on thy heart a finger lays, Saying ‘Beat quicker’.” Adverbial quick has been in constant use since 1300 – informal, but hardly disreputable.
  • Performative is a term in linguistics, relating to utterances that enact what they state: “I promise”, “I warn you”, “I apologise”. Nowadays most people use it to mean “relating to performance”, but the correct word for that, she says, is performatory. In fact both words have a patchy history. JL Austin coined the technical sense of performative in 1955, but for several years before that he had been using performatory that way. For the performance-related meaning, performative goes back half a century earlier and is the norm today. Despite Shriver’s pessimism, the word’s linguistic meaning is alive and well too – among linguists. Many words comfortably carry more than one meaning, depending on context. We don’t need performatory and we shouldn’t mourn it.

Change didn’t begin with the baby boomers. It’s always been happening (and people have always been complaining about it). The rules Shriver grew up with were simply the customs of the day – some ancient, some much newer. Most are still in place, but the changes stand out. And even when the changes aren’t changes, the realisation that many or most people don’t follow your preferred conventions can be disconcerting.

Our language is part of our culture, our identity. We like things to be done our way, and we like to think that our way has some objective, enduring superiority. So yes, it’s fascinating how ferociously people will cling to their abiding linguistic assumptions, however wrongheaded.

Childhood glimpses of a truer reality

You know that electric thrill when something jostles a shapeless old sack full of dim notions in one of the dusty, dark rooms at the back of your head and it rolls over and its contents somehow rearrange themselves and suddenly coalesce to form a coherent idea that bursts out of the sack, crackling with energy, glowing with light, beautifully new and familiar at the same time?

My brain doesn’t treat me to those moments all that often, but I now owe one to Lane Greene and his new book Talk on the Wild Side: The Untameable Nature of Language.

It’s a great book, lucid and insightful, a riposte to the grumpy sticklers who complain that our language is going to the dogs, who see every change as a sign of decay. Greene rejects their implicit view of English as “an easily threatened logical system for conveying meaning without loss or confusion, which could crumble if we don’t mind it carefully”. Instead, he says, we should realise that language is “a robust, organic and evolving phenomenon that needs relatively little intervention”.

Our language may be untidy, inefficient and imprecise at times, Greene says, but it is hardy, resourceful and adaptable.

To make his case, he explores the natural, social history of language change, the links between language and national identity, and the enormous difficulties of teaching machines to speak. He marvels at the dreamers who have tried to invent logical new languages and despairs at the clueless certainty of amateur grammarians. He even explains the underrated linguistic skills of Donald Trump.

The bit that really made my brain perk up, though, was a single well-placed turn of phrase.

Greene discusses what happens when children first bump into unexpected “rules” of English, rules that clash with the language they’ve picked up from the adults around them. Lots of us can remember a teacher telling us not to say “Can I?” when asking for permission, or not to say “Me and Billy went”…

When children are suddenly told that what they know their parents and virtually everyone else says, and what they have been saying all their lives thus far, is “wrong”, there’s an early disconnect between the child’s native competence and the new idea of an invisible but Platonically correct language out there, one that nobody seems to be using.

As Greene says, the typical reaction to this is puzzlement and even humiliation at being wrong when you’re sure you were right. If you continue to be bruised by encounters with “grammar”, you’ll grow to resent it.

But here’s what this made me think: Not all children react in the same way to those early encounters with the Platonic realm of proper English.

There’s a scene from a David Tennant episode of Doctor Who where the Doctor talks about how the Master first became power-crazed:

When he was a child. That’s when the Master saw eternity. As a novice, he was taken for initiation. He stood in front of the Untempered Schism. It’s a gap in the fabric of reality through which could be seen the whole of the Vortex. You stand there, eight years old, staring at the raw power of time and space, just a child. Some would be inspired, some would run away, and some would go mad.

When kids glimpse Platonic English, some of them realise that they’re supposed to learn how to use it, and they make partly successful efforts to do so. Some are distrustful, thinking it a threat to their own language, and they try to avoid it. And some stare at this elusive, mysterious knowledge – a purer, higher truth that sets the elite few apart from the masses – and they go mad.

Is this the moment when apparently ordinary children turn into budding grammar-peevers? Is this the catalyst that makes a love of knowledge fuse with a respect for authority to produce a desire for linguistic superiority?

When I remember my own early experiences of being “corrected” by teachers, I think I managed to avoid going mad. Just.

I was one of those annoying kids who took pride in being clever, and I had a pedantic streak that nowadays is mostly under control. More charitably: I liked understanding things. And the rules of English were something I could take an interest in. I was at risk of proto-peeverism, and as young as seven or eight my speech and writing were already more formal than the average kid’s (even the average smart-alec kid’s), but I managed to keep my feet on the ground. During my teens I grew out of the idea that formal language was a sign of sophistication and intelligence and maturity.

But some people don’t grow out of it. They grow ever more fixated with the Platonic world, anxious to protect its perfection from the barbarian hordes of real life and zealous to conquer the rest of us in its name.

If they were to read Talk on the Wild Side, they might learn to ease up. Maybe just a little.

Alighting the escalator: a transitive in transit

The other day I saw this sign at a railway station:

Sign at Euston station: “When you alight the escalator keep moving forward”

Sorry the photo’s a bit blurry, but funnily enough I was on a moving escalator at the time.

When I saw it I thought: “‘Alight the escalator’? What in the name of Samuel Johnson’s gout-ridden ghost is this sorry abomination? Shouldn’t it be ‘alight from the escalator’?”

I hadn’t ever seen “alight” as a transitive, non-phrasal verb before. So, after apologising to the people I knocked over when I stopped to take the picture, I ran to the dictionaries – Oxford, Collins, Chambers, Macmillan, American Heritage, Merriam-Webster – but none of them had this transitive usage.

The norm is “alight from”, although the OED notes some historical uses of “alight out of”, “alight down from” and “alight off”.

So, is “alight the escalator” a quirk of one sign writer or an up-and-coming innovation? Or maybe even a nonstandard usage that has been rumbling along unnoticed for a long time?

It makes me think of “depart”, which is another official-sounding word strongly associated with transport signs and announcements, and which is increasingly being used transitively without a “from” (compare “depart from the station” and “depart the station”). But from what I can tell, transitive “alight” is a lot rarer than transitive “depart”.

When I started looking into usage of the verb “alight”, I found something surprising: there isn’t much. I already knew it was one of those quaintly formal bits of transport-speak, even more so than “depart”, but I hadn’t realised just how rare it is.

In the Corpus of Contemporary American English (a record of 520 million words used during 1990–2015), I found just 34 instances of “alighting/alighted from the”. There was one transitive, without the “from”.

In the GloWbE corpus of online usage, 774 million words from US and UK websites contained just 16 uses of “alighted/alighting from the” and five of “alighted/alighting the”.

These are tiny numbers.

A search of Google Books produced more examples of both. Uses of “alight/alighting/alighted from the train” massively outnumber those of “alight/alighting/alighted the train”. But still, the vast majority of the latter were results like “after you alight, the train will pull away” and “I should not have alighted. The train was gone…”

Most of the definite examples of transitive “alight the train” that I could find were post-2000. The earliest was in a US Congressional committee report from 1975.

So it seems that transitive “alight” is flitting around the edges of usage. If it starts to catch on, it will find itself pursued by a mob of angry pedants with plasticine pitchforks.

Like it or not (and I don’t), I think it could spread fast. When a word’s use is dominated by a small group – such as railway staff – there’s more scope for idiosyncrasy in that group to change that use. If signs like the one I saw appear across the country, and if announcements follow suit, that will set the tone for the rest of us.

But who knows? Anyway, there’s a bit more to this story: the beginning.

The earliest “alight from” in the OED’s records is from 1477, in William Caxton’s translation of Raoul Le Fèvre: “Peleus and Iason were alighted from their hors.” But “alight of” and other variations go back all the way to Old English, the first in the OED being from Ælfric of Eynsham in the late 900s: “Ic geseah þurh Godes gast, þa se þegen alihte of his cræte” (I saw through the Spirit of God that the officer alighted of his chariot).

And here’s the funny thing. Back then, there was a transitive sense of “alight”. It’s been obsolete since the 1600s, but it meant “to make light, or less heavy; to lighten, alleviate (a burden); to relieve (a person) of a burden”.

Here’s an early example, also from Ælfric: “Þa ealdan cyningas on ðam ærran timan hogodon hu hi mihton heora byrðena alihtan forþan ðe an man ne mæg æghwar beon” (the old kings in previous times thought about how they might alight their burden, for one man cannot be everywhere).

And one from Reginald Pecock in 1449: “for this cause of aliȝting the poor men it is alloweable and profitable, that lordis and ladies haue mansiouns with inne the cloocis gatis”.

These two are figurative uses, but it could be literal as well: if you got off your horse, you would make it lighter; you would relieve it of its burden; you would alight it.

And if there had been an escalator around for you to get off, you would have alighted that too.

If the rule you followed brought you to this, of what use was the rule?

The only thing as bizarre and horrifying as the Trump administration’s loudening belches of vicious, incompetent corruption is the coverage thereof in the New Yorker. Specifically, the punctuation.

You may have seen this headline:

Donald Trump, Jr.,’s Love for Russian Dirt

Period, comma, apostrophe. What a grotesque sight for these three punctuation marks to be colluding so brazenly. Why did this horror happen?

The New Yorker’s Andrew Boynton explains:

The reasoning for the punctuation of “Jr.,’s” is pretty straightforward. It’s a collision of conventions. The first convention is The New Yorker’s—we place a comma before “Jr.” Doing so leads to another of our conventions: when something like “Jr.” occurs in the middle of a phrase, clause, or sentence, it is set off by its preceding comma and a following comma. Thus: “Ed Begley, Jr., was in ‘St. Elsewhere.’” A third convention is one that we all accept: the possessive is indicated by the addition of an apostrophe and “s.” We (the magazine) like our punctuation; we set things off with commas a lot; it drives some people nuts (i.e., it’s “bullshit”). This reaction is not surprising; it is also not new. With “Jr.” occurring in the middle of a line, where else is the possessive indicator supposed to go?

I’m happy to tell them where to stick it.

It’s silly

Boynton’s case sounds very logical, but it’s also very silly. The New Yorker is a magnificent publication, and one of the most carefully copyedited in the world, but some of its style conventions are odd – and applied too zealously.

A house style on punctuation (or any other aspect of language) exists to help the reader get at what the writer is saying. Consistency is a big part of that: if an article’s comma style varies from one paragraph to another, it can be disorienting and distracting.

But English is a magical beast, and the ways we can ride it are myriad. This means that too much consistency can get in the way. Sometimes a mostly reasonable style rule can cause trouble, and the pile-up that is “Jr.,’s” is one of those times. In these cases, a wise copyeditor will break the rule – or find a way to avoid it.

Also, it’s wrong

Oh yes, and neither comma should be there anyway, regardless of the apostrophe. A New Yorker convention this may be, but the pair of them are confusing and misleading. To explain, I’ll rely on the magazine’s wonderful “Comma Queen”, Mary Norris.

She has repeatedly defended the first and second commas in this notorious New Yorker sentence: “Before Atwater died, of brain cancer, in 1991, he expressed regret.” Critics think these two commas needlessly fussy, but Norris says:

I took a good, hard look at the magazine’s policy, and I persuaded myself that in fact these commas were not indiscriminate. They marked off segments of the sentence that were not germane to the meaning. The point of the sentence… is that Atwater expressed regret before he died. What he died of and when he died of it are both extra details that the author, Jane Mayer, provides only to satisfy the reader’s curiosity. They aren’t essential to the meaning of the sentence. They are nonrestrictive.

I don’t wholly agree with this policy, but Norris’s reasoning is clear. The two phrases bracketed by commas – “of brain cancer” and “in 1991” – are inessential asides. You could cut them and the main statement would be intact: “Before Atwater died, he expressed regret.”

With that in mind, let’s look at the Trump headline again:

Donald Trump, Jr.,’s Love for Russian Dirt

Here, “Jr.” is bracketed by the pair of commas. It looks like one of those inessential, nonrestrictive asides. So let’s try cutting it to see if it leaves the main part intact:

Donald Trump’s Love for Russian Dirt

Nope. When there are two Donald Trumps in a story, “Jr.” and “Sr.” are absolutely essential. But the commas suggest otherwise.

Perhaps there is a line in the New Yorker style guide saying, “Oh, but in this sort of case, the commas are working differently.” But, if so, New Yorker readers don’t know it.

One of the lessons it took me a while to learn as a copyeditor is that you must work to satisfy readers who neither know nor care what your style guide says. If instead you work – however accurately – to correspond to a geekily intricate but internally consistent set of arcane rules, you are failing. You are working to satisfy yourself and your peers.

Those commas, like the Donald Trumps, are a repellent pair that I wish would appear in the news less often.

Paul Romer and the World Bank and conjunctions and brevity and bad targets

The World Bank’s chief economist, Paul Romer, has been demanding that his staff use the word “and” less.

Why? It’s such an innocuous little word.

This is his thinking:

Circulating a draft of the upcoming World Development Report, Mr Romer warned against bank staff trying to pile their own pet projects and messages into the report. The tendency, he argued, had diluted the impact of past reports and led to a proliferation of “ands”.

“Because of this type of pressure to say that our message is ‘this, and this, and this too, and that …’ the word ‘and’ has become the most frequently used word in Bank prose,” he complained in an email.

Romer mentions a study of World Bank writing, which highlights this phrase as an example:

emphasis on quality, responsiveness, and partnerships; on knowledge-sharing and client orientation; and on poverty reduction

This is 16 words (counting “knowledge-sharing” as two), of which three are “and” – 18.75%. That’s very high.

I agree that a lot of corporate writing – like this quoted phrase – tries to say too much, making it clunky, boring and unfocused. But a mechanical focus on the word “and” is the wrong way to fix this.

Imagine you’re a junior member of Romer’s team. As instructed, you’re trying to reduce the “and”s in that phrase. The simplest thing would be to change it to this:

emphasis on quality, responsiveness, partnerships, knowledge-sharing, client orientation and poverty reduction

Now you only have one “and” in 12 words (still counting “knowledge-sharing” as two) – just 8.3%. A great improvement!

Except that the whole waffly list of things is still there. You may have hit the target but you’ve missed the point.

The point is to say less. So instead, you talk to your colleagues to figure out what’s truly important to keep in that list, and you all agree that it only needs to emphasise the first two points. So you can cut it down to this:

emphasis on quality and responsiveness

This is a great improvement. Shorter and much better-focused.

There’s only one problem: five words, one “and” – you’re up to 20%. Romer will not be happy. But Romer will be wrong.
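Since we’re talking about targets and how easily they’re gamed, here’s a rough sketch of how an “and”-share metric could be computed mechanically. The function name and the tokenising rule are my own choices (hyphenated compounds such as “knowledge-sharing” split into two words, matching the counts above):

```python
import re

def and_share(phrase):
    # Split into lowercase word tokens; a hyphenated compound like
    # "knowledge-sharing" naturally falls apart into two words.
    words = re.findall(r"[a-z]+", phrase.lower())
    return words.count("and") / len(words)

original = ("emphasis on quality, responsiveness, and partnerships; "
            "on knowledge-sharing and client orientation; and on poverty reduction")
trimmed = "emphasis on quality and responsiveness"

print(f"{and_share(original):.2%}")  # 18.75%
print(f"{and_share(trimmed):.2%}")   # 20.00%
```

Note what the numbers show: the short, focused phrase scores worse on the metric than the original waffle. A target like this rewards padding, not concision.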

Word By Word: The Secret Life of Dictionaries – by Kory Stamper

It turns out that dictionaries don’t just coalesce out of the antique dust in academic libraries, hardening on the shelves into compendia of immutable, authoritative Truth. Human beings write them.

There are actually real people who pay intense attention to words and their uses, who record and catalogue these uses, who spend hours and days and decades sifting these uses and carefully analysing them – a different shade of meaning here, a grammatical variation there – and who then try to encapsulate these analyses in succinct, helpful definitions, so that chumps like me can one day look them up and announce: “Well, according to The Dictionary…”

Real people do this for a living. One of them is Kory Stamper, who has written a beautiful, fascinating, witty, loving, irreverent book about the life of the lexicographer.

In Word By Word, she tells us about her career at Merriam-Webster and the colleagues who wrestle with pronunciation, etymology and how to socialise with the extroverts from marketing. She tells us about the silent frenzies of defining that rage inside their cubicles, about how lexicography is not just a job but a way of life. She shares the joys of handling correspondence from a polite and well-informed public (hashtag sarcasm), and she talks us through some of the words that have caused the most trouble and how they were eventually brought to book.

How do you handle racial bias in explaining the meaning of “nude”-coloured pantyhose? Where can you look to find out where “posh” really came from? How do you tease apart the many, many senses of “take” without losing your mind? What are and aren’t the essential features of a “surfboard”? What exactly was the problem with an old definition of “bitch”? And how do you convince an angry mob that “irregardless” is a word whether they like it or not?

She explains what dictionaries really do and busts myths about what they don’t do. And she unearths some gems from the history of lexicography, spanning Samuel Johnson, Noah Webster and what happens when rival dictionaries fall out.

And she makes it all such damned good fun.

I’m a copy editor who blogs about language, and I like to think of myself as a word nerd. But I’m not. Kory, who toils endlessly on the linguistic equivalent of the Human Genome Project, is the very definition of a word nerd.

You should look her up.

Full disclosure: I slightly know Kory, and she bought me a beer once. So, for all you know, I’m biased and possibly drunk. Here are some other reviews by proper people: Megan Garber (Atlantic), Stan Carey (Sentence First), Jennifer Schuessler (New York Times), Caitlin PenzeyMoog (AV Club), Stevie Godson (New York Journal of Books).

‘Got’ and ‘gotten’ in British and American English

As every loyal British subject knows, American English is bad and wrong and stupid and a threat to our way of life. So I guess that makes me a traitor. I find it hard to worry about a few new imported words and phrases every year, especially as we exported our entire language across the Atlantic.

Yes, some of these newcomers grate – new words often do – and many of those ones don’t survive long. But others have become so well-established that most Brits don’t even realise they were made in America. Here are a few of the terms we’ve gained from the US: bandwagon, bedrock, blizzard, boss, cereal, close-up, comeback, cocktail, crook (criminal), ditch (get rid of), electrocute, fan (devotee), footwear, gobbledygook, graveyard, joyride, know-how, maverick, radio…*

The US has been a wonderful source of enrichment for British English, even if some of those riches seemed vulgar at first.

How gotten works

One noteworthy word is gotten: standard in the US but not in the UK.

In both countries, the past tense of get is got. In British English, the past participle is also got. But in American English, it’s more complex. Roughly: when talking about a static situation (possessing or needing) the past participle is got; when talking about a dynamic situation (acquiring or becoming) the past participle is gotten. So:

  • Yesterday I got a new guitar
  • I’ve got a great guitar
  • I’ve gotten a new guitar
  • You’ve got to see my new guitar
  • I got into playing the guitar last year
  • I’d gotten into playing the guitar the previous year

Any Brit who reads American books or watches American TV and films will have come across gotten. And, in fact, more people in Britain are coming to use it themselves – although it’s still often seen as an Americanism.

For a snapshot of recent usage, I looked at the GloWbE corpus of text from 1.8 million web pages in 2012. I compared different countries’ uses of has gotten, have gotten and had gotten with has got, have got and had got. The results don’t account for differences between static and dynamic situations, but they give us a rough relative picture.

On US websites, has/have/had gotten outnumbers has/have/had got by almost two to one. So that’s what a fully operational got–gotten distinction looks like. On Canadian sites, gotten is only slightly ahead of got, which suggests usage may be a bit more mixed. In Australia and Ireland, got is ahead by about three to one; gotten is common, but not fully accepted. And on British sites, has/have/had got outnumbers has/have/had gotten by seven to one.

(In a more formal context – Hansard’s record of proceedings in Parliament since 2010 – the ratio is about 1,500 to one.)

So, gotten is still far from mainstream in the UK, but it has built a firm presence. And, whether or not it catches on to become standard, it’s another example of British English using an Americanism.

Except that it isn’t.

This is how British English used to work – or rather, how English English used to work before Britain even existed.

The English decline of gotten

The huge list of example sentences in the OED suggests that gotten reigned supreme until the late 1500s, when got increasingly appeared in its place. Shakespeare and Hobbes used both. Got seems to have overtaken gotten around 1700.

Geoffrey Chaucer (Legend of Good Women, c1386): Ffor he woste wel she wolde nat ben geten

John Paston (letter, 1477): The Frenshe Kynge hathe gothen many off the townys off the Dukys off Borgoyne

Myles Coverdale (Bible translation, 1535): Treasures that are wickedly gotten, profit nothinge

William Shakespeare (Henry VI pt 2, c1591): Jack Cade hath gotten London Bridge

Shakespeare (Henry VI pt 3, c1591): The Army of the Queene hath got the field

Walter Raleigh (letter, 1618): I had gotten my libertye

Richard Whitlock (Zootomia, 1654): they should have got a whipping

John Evelyn (letter, 1690): I have now gotten me a pair of new horses

George Berkeley (Alciphron, 1732): Some old Ideas may be lost, and some new ones got

John Stepple (testimony at the Old Bailey, 1742): I would go and fetch a Constable, for he had got the Thief

Usage commentators eventually noticed the change, but too late to do anything about it. Robert Lowth’s popular Short Introduction to English Grammar (1762) complained about “a very great Corruption, by which the Form of the Past Time is confounded with that of the Participle” – including the use of got instead of gotten. Lowth said: “This confusion prevails greatly in common discourse, and is too much authorised by the example of some of our best Writers.”

Maybe Lowth was thinking of Samuel Johnson, whose dictionary, seven years earlier, had uncritically listed both got and gotten as options for the past participle of get. Neither Johnson nor Lowth commented on the difference between static and dynamic situations.

And then in 1795, Lindley Murray’s blockbuster English Grammar declared that gotten was “obsolete”. That’s an overstatement, but by then it was uncommon, at least in standard usage. It partly survived in some nonstandard dialects (such as in Scotland and Ireland), as well as in the fossilised phrase ill-gotten gains. And there British English stayed for the best part of two centuries.

The American rebirth of gotten

In the US, got also dominated, but gotten survived on the fringes.

Noah Webster’s dictionary of 1828 said that gotten was “nearly obsolete in common parlance”. But it also said the same of forgotten and swollen. A generation later, Richard Meade Bache’s Vulgarisms and Other Errors of Speech (1869) said that gotten was still only “nearly obsolete”.

But Richard Grant White in Words and Their Uses (1870) saw the issue as a live dispute, and he picked a side: “I am asked, for instance, whether gotten… belongs to the list of ‘words that are not words.’ Certainly not.” Alfred Ayres in The Verbalist (1881) agreed: “If we say eaten, written, striven, forgotten, why not say gotten, where this form of the participle is more euphonious – as it often is – than got?”

The American revival of gotten seems to have started at the end of the 19th century.

Data from Google Books shows the end of gotten’s decline in British and American English (as with the GloWbE data, I’m looking at the ratio of has/have/had gotten to has/have/had got). And then – in the US – there’s the start of its recovery:



Some Americans continued to resist it, such as Dana Jensen (Modern Composition and Rhetoric, 1935), who said, with a whiff of wishful thinking, that “gotten… has been supplanted by got in formal usage”. The mention of formal usage suggests that the rearguard action had narrowed its focus to style, but still it was doomed.

In 1942, Eric Partridge’s Usage and Abusage accepted the return of gotten: it was “obsolete in Great Britain… but in the U.S.A., gotten (past participle) is preferred to got”.

And Theodore Bernstein in The Careful Writer (1965) agreed, noting the value of the got–gotten distinction. He quoted the linguist Albert Marckwardt: “In fact, most Americans regularly make a very precise distinction between got and gotten. ‘We’ve got ten thousand dollars for laboratory equipment,’ means that the funds in question are in our possession – we have them. ‘We have gotten ten thousand dollars for laboratory equipment,’ means that we have obtained or acquired this particular sum of money.”

In the late 20th century, gotten surged back into mainstream, standard usage in the US. And British usage has tentatively started to follow.



The future

I don’t think any British usage guide has yet endorsed gotten, and for the moment that seems fair – although the reason has changed. A century ago, the word would have seemed affectedly archaic (Henry Fowler’s judgement in 1926); today, the risk is that it comes across as affectedly American. How easily we forget our history.

Jeremy Butterfield’s 2015 edition of Fowler notes that gotten is on the increase in the UK. And I’ve been noticing it more and more in British conversation over the last few years – mostly from younger people. So I’d guess this shift is generational rather than because individuals are changing their usage. While it’s easy to pick up new words at any age, the grammar of a common verb like get may be a more fundamental thing to relearn. I’d expect gotten to keep growing – but slowly, and mostly in casual contexts.

And why shouldn’t we Brits use it? As Marckwardt and Bernstein said, and as millions of Americans have found, it’s useful. And it’s a part of our heritage that the US is helping us to recover. But I may be too set in my ways to start using it myself.


* Thanks to various people on Twitter for helping me with the list of Americanisms that have become British – especially to Andrew Brightwell, who pointed me towards a paper by Katerina Pauliuc.