We need to talk about drug-resistant infections

Do you know what “antimicrobial resistance” is?

Most people don’t. That’s one of the findings from a series of focus groups and interviews commissioned by the Wellcome Trust (where I work).

First of all, “antimicrobial”. A lot of people have never come across this word. Antimicrobials are a family of drugs that kill microorganisms. It’s a broad über-category spanning several types of drug – including antibiotics, which pretty much everyone has heard of.

You probably have a rough idea what antibiotics are, even if you don’t know the exact definition: they treat infections that are caused by bacteria. Other types of antimicrobial drug treat infections that are caused by viruses or by fungi.

So, based on that, what’s antimicrobial resistance?

This still flummoxed the people in the focus groups. They hadn’t heard the phrase, but some of them thought they could work out what it meant: it’s when a person who has taken antimicrobial drugs for a while develops a resistance to them.

It sounds logical. It’s also dangerously wrong.

Scientists and health policy makers use “antimicrobial resistance” to mean that the microbes develop a resistance to the drugs.

This is the sort of confusion that happens when you carelessly use an abstract noun like “resistance” – whose resistance to what? It’s especially confusing when you couple it with a scientific word that most people don’t know.

A better term, the research suggested, would be “drug-resistant infections”. This clearly says who is resistant to what.

People need to understand this. Not because science is cool, but because people’s behaviour contributes to the growth of infections that are immune to our best drugs. For instance, as my colleague Kate Arkless Gray says, “if people think that they will develop a resistance to antibiotics, they may be less likely to finish the full course, when in fact not finishing the course could increase the rate at which resistance develops”.

We must talk to people in a language they understand. It can be hard to appreciate how little specialist language really seeps into public awareness, but it’s essential to make the effort to understand your audience.

And that’s why we need to talk about drug-resistant infections.

Synonymous with correct usage

One of the common mistakes I come across when I’m editing involves the word synonymous. You can see it in sentences like this:

Roger Federer is synonymous with great tennis.

Does that seem OK to you? If so, I’m afraid you’ve fallen prey to a confusion that affects many people. But it’s easy to explain.

A synonym is a word having the same meaning as another. It originally comes from the Greek syn (meaning same) and onyma (meaning name). Based on this, the adjective synonymous is defined as “having the same meaning”.

So big is synonymous with large and field is synonymous with meadow – but Roger Federer is not synonymous with great tennis. The man is associated with the game, sure, but the two terms don’t mean the same thing – otherwise people would talk about Serena Williams playing some really Roger Federer. And that’s obvious nonsense!

For centuries, people used synonymous without any difficulty, but recently some have started to extend its use, blurring the true meaning. This misuse is now widespread, but careful writers still avoid it – and so should you. If you use synonymous in this loose modern way, you risk not just confusing your audience but also degrading the word so much that it becomes unusable.

This would impoverish the English language and make it harder for us all to communicate. But it’s not too late to avert this, if we just make the small effort to use synonymous only in its original, logical, correct sense.

Fowler’s Dictionary of Modern English Usage (4th edition) – by Jeremy Butterfield

“Is that good English?”

Century after century, people ask this and publishers sell books to answer them. A lot of these books are terrible but some are linguistic treasure troves. A lot of them sink without trace but some endure for decades.

This is one of the good ones.

A brief history of Fowler

One of the biggest names in usage guidance over the last 100 years, especially in Britain, has been Fowler.

Henry Fowler published his Dictionary of Modern English Usage in 1926. It contained a lot more learning, insight and modesty than many of today’s usage guides, but it was fundamentally a book of Fowler’s opinions. These opinions, shaped by his lexicographical work, were sometimes wise, sometimes idiosyncratic. His writing style was sometimes witty, sometimes obscure. It was a hugely influential book, which is why its successors still bear the Fowler name.

In 1965, Ernest Gowers edited the second edition. There were cuts and additions and polishes, but it was essentially an update, and the bulk of the book stayed the same.

The third edition came in 1996, edited by Robert Burchfield. This was a near-total rewrite, although there were still plenty of traces of Fowler. It marked a shift in approach towards greater objectivity and more systematic analysis. Burchfield had built up a database of language use far exceeding Fowler’s more hotchpotch collection, and he used it to inform his judgements. That said, he wasn’t shy with his opinions.

Now, in 2015, we have the fourth edition, edited by Jeremy Butterfield. It’s an update of Burchfield’s version, but a pretty substantial one. There are over 250 new entries and many of the others have been partly or mostly rewritten. Butterfield again improves the standard of evidence by using the immense resources of the Oxford English Corpus – a database holding 2.5 billion words of 21st-century usage from all sorts of sources. And while his own opinions still figure, they play a smaller role than Burchfield’s did.

They know not what they do

One of my more exciting hobbies is checking whether prescriptivists follow their own advice. Often, they don’t. For instance, a stunningly high proportion of people who criticise the passive voice use the passive voice themselves, even as they’re criticising it.

I’ve seen four of these in the last week or so (most if not all via Oliver Kamm).

First, in an article on LinkedIn, Bernard Marr writes:

While there’s nothing absolutely wrong about passive voice, it’s considered weak writing.

Note Marr’s own use of the passive voice: “it’s considered”. And I don’t think it’s weak – and nor are the other nine uses of the passive in that article.

Second, in a blog post, Nigel Dudley writes:

Organisations use the passive in their statements, particularly when they have been criticised and want to dodge responsibility.

True, the passive voice can be used to avoid mentioning who was responsible for something. But here, there’s nothing evasive about Dudley’s use of it. He’s talking in general, not about any particular instance of criticism. It’s a perfectly good turn of phrase.

He says that we should “doubt the integrity of those who use the passive a lot” – but I disagree, because he’s clearly sincere even though he uses the passive another six times in the post.

Third, and more shockingly, the Economist Style Guide’s entry on “passive” says:

Be direct. Use the active tense. A hit B describes the event more concisely than B was hit by A

Talking about the “active tense” or the “passive tense” is a howler: they’re voices, not tenses. Either can be used in any tense:

  • A is hitting B; B is being hit by A
  • A will hit B; B will be hit by A
  • A had hit B; B had been hit by A

And while this entry doesn’t use the passive itself, the two immediately below it do:

Peer (as a noun) is one of those words beloved of sociologists and eagerly co-opted by journalists who want to make their prose seem more authoritative.

Per capita is the Latin for by heads; it is a term used by lawyers when distributing an inheritance among individuals…

The “per capita” example could have been written in the active with a tiny gain in concision (“…it is a term lawyers use when…”). But the “peer” one would have been awkward and in fact longer (“words beloved of sociologists and which journalists eagerly co-opt when they want…”).

Many other entries on the P page of the Economist guide use the passive – and use it well.

Fourth, and most spectacularly, Toby Young writes in the Spectator:

On the contrary, nearly all of Gove’s rules can be traced to George Orwell’s ‘Politics and the English Language’, an essay that’s generally regarded as the best guide to writing good English that has ever been produced. To give just one example, Orwell’s fourth rule is ‘Never use the passive where you can use the active’. Kamm singles this out for condemnation in his Times article, apparently unaware that it was first set out by the finest prose stylist of the 20th century.

There’s one final reason why it was sensible of Gove to set out these rudimentary principles. I’m absolutely certain that for every Oliver Kamm who bridles whenever these old-fashioned rules are observed, there are 10,000 Toby Youngs who feel almost physically assaulted when they’re ignored.

Part of the beauty of this self-blindness is that Orwell’s essay also used the passive voice extensively, including in his complaint that “the passive voice is wherever possible used in preference to the active”. So Young is truly a disciple of his hero.

(Other professional writers who condemn the passive while using it include Stephen King and Simon Heffer.)

So, what’s the moral of all this?

I am shallow and I can’t deny a certain cheap ‘Gotcha!’ satisfaction in spotting examples like these. But I’m not faulting these writers’ uses of the passive: this rule-breaking prose is mostly well-written, and there’s something to learn from that fact.

Young, in particular, knows how to put sentences and paragraphs together. But he apparently doesn’t know how he does it. He thinks certain grammatical rules make him “feel almost physically assaulted when they’re ignored”. But that passive construction, and the others, didn’t make him feel assaulted. He wrote them, probably read them over once or twice, and thought they were fine. And on that point, at least, he was right.

The passive voice is an essential tool in every good writer’s repertoire. Oliver Kamm, in the Times, gives a superb example of Orwell using it in his essay:

Orwell describes the reality of the anodyne term pacification: “Defenceless villages are bombarded from the air, the inhabitants driven out into the countryside, the cattle machine-gunned, the huts set on fire with incendiary bullets…” – passive clauses, every one. It’s powerful writing because Orwell ignores his own advice.

I’ve written plenty about the passive voice before. In summary, it can indeed be stuffy, verbose and evasive – but it can also be useful for shifting focus and improving flow.

But today I’m making a point about the psychology of prescriptivism (or what Joseph Williams called “the phenomenology of error”).

When people insist that a certain use of language is bad or wrong but use it themselves, even while doing the insisting, something is amiss. Their beliefs about language have become unmoored from their use of language.

They have heard that the passive voice (or adverbs, or split infinitives, or fused participles, or singular “they”, or “who” as object, or whatever) is bad. They have seen a few examples of it being used to bad effect. This has convinced them that it is bad, and so they’ve started to preach the rule against it themselves.

But they haven’t thought enough about possible good uses of the passive (or whatever). And because they don’t realise that the human mind is far from transparent to itself, it doesn’t occur to them that their proud, firm belief isn’t reflected in their own fluent, natural behaviour. They don’t notice how useful they find the thing that they condemn.

They understand how to use language. That understanding runs deep – deeper than the conscious belief they’ve adopted – and it is what keeps them good writers even as they become bad writing advisers.

There is no surer sign of a bogus rule than that it cannot take root in its own evangelists’ minds.

Michael Gove on “however” and contractions

Michael Gove, the Lord Chancellor and Justice Secretary, has given his civil servants some guidance on writing.

I won’t go over all of it, but a few things stand out.

First, some positives. He urges civil servants to take a “warm tone” and be “gracious in thanking people for their letters”, to avoid being repetitive or “too pompous”, and to acknowledge their correspondents’ views and arguments even if not agreeing with them.

This is good. Official letters are often impersonal, stern and even hostile, so this shove from the top is welcome.

He also tells civil servants to avoid using “this” and “it” on their own, instead wanting them “to write exactly what they are referring to”. This can help – sorry, this advice can help – to make writing easier to understand.

But there are two things I disagree with.

Everything you ever wanted to know about singular “they”

Some people don’t like the singular, gender-neutral use of “they” (along with “them”, “their” etc.):

  • They hung up without saying anything.
  • Who finished their work first?
  • If anyone forgets their hat, you’ll have to leave them out of the photo.
  • Nobody who cares about their future can ignore this.

The objection is:

“They” and its cousins are plural and can’t be singular. The first two of the sentences above are about a single person, so they’re just wrong. The other two may be about several people, but “anyone” and “nobody” are still singular words (we say “anybody is”, not “anybody are”), so they’re still ungrammatical. While the traditional generic “he” can seem odd or sexist, and “he or she” can be clumsy, that doesn’t mean we should break the logical rule that separates singular from plural.

If you’re inclined to agree, I’d like to try to convince you otherwise. It’s fine to dislike singular “they”, but maybe you needn’t worry about it so much.


Pronouns are a mess, but they’re a familiar mess

Here’s a simple, irrefutable proof that a pronoun can be both singular and plural: “you”.

We gave up on the distinction between singular “thou” and plural “you” centuries ago, and it hasn’t done us any harm. This means the so-called logical objection to singular “they” is wrong.

In fact, logic is a poor guide to English pronouns, which are an inconsistent mess:

  • Most have different subject and object forms – “I” and “me”, “he” and “him”, “she” and “her”, “we” and “us”, “they” and “them” – but “you” is both subject and object.
  • Third-person singular pronouns vary by gender – “he” and “she”, “him” and “her” – but other pronouns don’t.
  • Reflexive pronouns are irregular: “myself”, “yourself”, “herself” and “ourselves” are formed from the possessive “my”, “your”, “her” and “our” – but then instead of “hisself” and “theirselves”, we have “himself” and “themselves”.

Our pronouns are deeply illogical, but we don’t notice this because we’re so familiar with them.

And there’s another oddity. “We”, “us” and “our” can be, and often are, singular:

“Each one of us will have our own special triumphs or tragedies to look back on.”

Here, “each one of us” is singular, but the following possessive is “our”. This is natural and clear – and grammatical (spoken by someone with unimpeachable command of the Queen’s English).

We’ve all come across this sort of thing many, many times, but we haven’t been trained to find it illogical, so we don’t. We don’t even notice it. Google “singular we” and you’ll find pretty much nothing; Google “singular they” and you’ll tumble into a vortex of angry pedantry.

This shows that dislike of singular “they” isn’t a natural, logical reaction to a real grammatical mistake; it’s an artificial constraint that takes effort to internalise.

A third-person version of that sentence would be:

“Each one of them will have their own special triumphs or tragedies to look back on.”

Equally natural and clear – and grammatical.

“You” can be singular as well as plural, which everyone accepts. “We” can be singular as well as plural, which no one notices. “They” can also be singular as well as plural, and the only problem is the people who believe it’s a problem.

Let’s meet a couple of them.

What style of language do scientists really prefer?

“Our readers are intelligent, well-educated scientists. Why should we make our language dumbed-down, patronising and imprecise in the name of ‘readability’?”

It’s a fair question. Here’s the answer.

Never talk down to your readers. But never waste their time, either. And scientists, while intelligent and educated, are also busy. As well as their research, they may have teaching, management or clinical duties to perform, funding applications to write, presentations to plan, journals to keep up to date with… They don’t have time to wade through verbiage in search of facts.

If you’re writing about something complex, then of course you need to give all the necessary detail. If you’re writing for specialists, you can use their specialist terms. But you don’t need to add verbal complexity beyond that. Keep it clear and direct. This makes your writing more efficient and more likely to succeed in communicating your message. It’s also courteous to your readers.

Einstein may or may not have said, “Everything should be made as simple as possible, but no simpler.” Either way, it’s good writing advice.

But we don’t all follow it.

Many people, when writing in a work context, feel that they’re supposed to use language that is more abstract, impersonal and convoluted than they otherwise would. The idea is that this makes the writing sound more professional. But the result is often that it’s unclear and off-putting, even to highly intelligent readers.

Language like that can be made more concrete, more personal and more concise without dumbing down the content, without losing important information, and without making the tone inappropriately casual. In fact, directness and clarity normally sharpen the tone and can even help to add precision. Clearing up overgrown language can show you previously hidden patches of ambiguity.

Testing the hypothesis

Scientists like evidence, so let’s have some.

John Kirkman ran several surveys of scientists for his book Good Style: Writing for Science and Technology.

In one, he gave two versions of a passage from a research paper to 2,781 readers from medicine and the life sciences:

Brown’s version

In the first experiment of the series using mice it was discovered that total removal of the adrenal glands effects reduction of aggressiveness and that aggressiveness in adrenalectomized mice is restorable to the level of intact mice by treatment with corticosterone. These results point to the indispensability of the adrenals for the full expression of aggression. Nevertheless, since adrenalectomy is followed by an increase in the release of adrenocorticotrophic hormone (ACTH), and since ACTH has been reported (Brain, 1972) to decrease the aggressiveness of intact mice, it is possible that the effects of adrenalectomy on aggressiveness are a function of the concurrent increased levels of ACTH. However, high levels of ACTH, in addition to causing increases in glucocorticoids (which possibly accounts for the depression of aggression in intact mice by ACTH), also result in decreased androgen levels. In view of the fact that animals with low androgen levels are characterised by decreased aggressiveness the possibility exists that adrenalectomy, rather than affecting aggression directly, has the effect of reducing aggressiveness by producing an ACTH-mediated condition of decreased androgen levels.

Smith’s version

The first experiment in our series with mice showed that total removal of the adrenal glands reduces aggressiveness. Moreover, when treated with corticosterone, mice that had their adrenals taken out became as aggressive as intact animals again. These findings suggest that the adrenals are necessary for animals to show full aggressiveness.

But removal of the adrenals raises the levels of adrenocorticotrophic hormone (ACTH), and Brain2 found that ACTH lowers the aggressiveness of intact mice. Thus the reduction of aggressiveness after this operation might be due to the higher levels of ACTH which accompany it.

However, high levels of ACTH have two effects. First, the levels of glucocorticoids rise, which might account for Brain’s results. Second, the levels of androgen fall. Since animals with low levels of androgen are less aggressive, it is possible that removal of the adrenals reduces aggressiveness only indirectly: by raising the levels of ACTH it causes androgen levels to drop.

Kirkman sums up the main differences:

  • shorter, less complex sentence structures in Smith’s version
  • all necessary specialist terms are present in both versions, but Smith uses familiar words in place of unnecessary specialist terms
  • fewer passive structures and less roundabout phrasing in Smith’s version
  • paragraphing

The scientists were then asked what they thought of the two versions.

“Which style do you prefer to read when you read scientific texts?” 74% picked Smith and 21% Brown.

“Which style do you think is more appropriate for scientific texts?” 57% picked Smith and 25% Brown.

Clearly the Smith style was much preferred, but the disparity between these two answers is interesting. A significant minority, even though they favoured Smith’s style, were still reluctant to endorse it for professional use. Presumably they were worried that their peers didn’t share their preferences. But, the survey shows, most did (they also rated Smith’s version easier to read and more precise, and rated Smith as being more objective and having the better-organised mind).

This is a case of what Steven Pinker calls pluralistic ignorance:

a false consensus, in which everyone is convinced that everyone believes something, and believes that everyone else believes that they believe it, but in fact no one actually believes it. One example is the cachet that college students place on drinking till they puke. In many surveys it turns out that every student, questioned privately, thinks that binge drinking is a terrible idea, but each is convinced that his peers think it’s cool.

Convoluted prose isn’t so literally sickening, but many academics and journals still maintain the illusion that it’s cool.

Kirkman also ran other surveys, giving different versions of specialist texts to chemical engineers, ecologists and biochemists. The results were similar. He lists the features of the preferred style:

Direct, verbs mainly active, minimum of special vocabulary, judicious use of personal and impersonal constructions, sentences of varied length but mainly short and not complex.

How to do it

For advice on how to improve convoluted writing, you could try Kirkman’s book, which focuses on scientific writing. Joseph Williams’s book Style: Toward Clarity and Grace is very good and has a broader remit.

If you want something a bit shorter, Rachael Cayley’s Explorations of Style blog has some useful posts that draw on Williams, among others, with a focus on academic writing. Look in particular at the ‘five key strategies’ posts listed on the right.

I also recommend Pinker’s The Sense of Style, particularly chapters 2–5 (don’t get too bogged down in the sentence diagrams). He too draws on Williams, as well as bringing in some psychological research on how we process language. He’s an engaging writer, if sometimes a bit combative.

I might try to write my own short guide one day. For now, I’ve done a case study [PDF] of a passage of text that isn’t from a journal article but is written for scientists. I’ve tried to improve it – cutting it in half but keeping the information, expressed more clearly and directly – explaining the changes as I go.

What the internet desperately needs is another blog post about the Oxford comma

The Oxford comma – also known as the Harvard comma, the serial comma, and the what the hell is wrong with you people why can’t you just get a life – provokes strong opinions.

It’s the difference between these two sentences:

1a) I ordered bacon, eggs and beans.
1b) I ordered bacon, eggs, and beans.

The Oxford comma is the last one, before the “and” in version 1b.

Should it be there?

Some people say no (loudly): it looks fussy and slows the sentence down. The “and” is quite enough to separate the last two items in the list.

Other people say yes (even more loudly): it’s helpful for clarity. Well, maybe not in this case, but it’s more important when the individual items in a list are grammatically more complex, especially if they contain “and”s. Compare:

2a) They ordered bacon and beans, chips and eggs and toast.
2b) They ordered bacon and beans, chips, and eggs and toast.
2c) They ordered bacon and beans, chips and eggs, and toast.

Sentence 2a, without the Oxford comma, is ambiguous about which ingredients make up which meals: it could mean either 2b or 2c.


3a) The hallway contains a chest of drawers, a mirror, a sculpture made of copper wire and a hatstand.
3b) The hallway contains a chest of drawers, a mirror, a sculpture made of copper wire, and a hatstand.

While 3a may take a moment to decipher – is the hatstand part of the sculpture? – 3b makes it clearer.

The Oxford comma can also help in cases like these well-known examples:

4a) We invited the strippers, JFK and Stalin.
4b) We invited the strippers, JFK, and Stalin.

5a) This book is dedicated to my parents, Ayn Rand and God.
5b) This book is dedicated to my parents, Ayn Rand, and God.

Here the problem is confusion between different types of comma. In 4a, is the comma after “strippers” there to keep items separate in the list or to introduce the extra information (an appositive phrase) that JFK and Stalin are the strippers?

Likewise, 5a might suggest that Ayn Rand and God are my parents. But 4b and 5b make the separation clear.

Now, sure, there’s no real danger of misunderstanding in these two cases; rather, the risk is of a brief sense of absurdity. There are more sensible examples, though:

6a) I asked my neighbours, an architect and a builder.
6b) I asked my neighbours, an architect, and a builder.

So the Oxford comma can be useful. But given that it hardly seems necessary in “I ordered bacon, eggs, and beans”, should we use it all the time?

The AP Stylebook says no, recommending it only in more complex or potentially ambiguous cases: “do not put a comma before the conjunction in a simple series: The flag was red, white and blue. He would nominate Tom, Dick or Harry.”

But the Oxford Guide to Style says yes: “Given that the final comma is sometimes necessary to prevent ambiguity, it is logical to impose it uniformly”. The Chicago Manual of Style also favours using it consistently, as do Bryan Garner, Steven Pinker and others. (It’s more common in American writing than in British.)

I agree that consistency is good – other things being equal. But other things are not equal. There are disadvantages to using the Oxford comma.

It can slow a sentence down. This is obviously subjective and depends on what you’re used to, but I find “I ordered bacon, eggs, and beans” pretty ponderous and deliberate.

The New Yorker’s Mary Norris has taste that goes the other way, favouring the Oxford comma everywhere: “I’ve gotten used to the way it looks. It gives starch to the prose, and can be very effective.” Well, if starch is what you want…

She adds:

The New Yorker isn’t asking you to pause and gasp for breath at every comma. … The commas are marking a thoughtful subordination of information. … It looks choppy, but you don’t have to chop it up when you read it.

Sure, this is thoughtful (it’s in a great extract from what looks like being a great book by Norris), but it makes two huge assumptions.

It assumes that readers consciously choose how commas come across to them. And it assumes that readers will understand the publication’s policy on commas and the reasons behind it. Even for the New Yorker, that feels like a stretch.

As a copyeditor, I’m a big fan of the fine distinctions copyeditors fret over. But we have to have some perspective about whether our readers understand those distinctions the same way we do – or even at all. Sometimes we might be zealously and ingeniously splitting hairs that are invisible to the untrained eye.

There’s a second, more serious problem with the Oxford comma: sometimes it creates the very ambiguity or absurdity that it’s supposed to remove. I’m amazed that its partisans so rarely acknowledge this, because you only need to tweak their examples slightly to see it:

7a) We invited the stripper, JFK and Stalin.
7b) We invited the stripper, JFK, and Stalin.

8a) This book is dedicated to my mother, Ayn Rand and God.
8b) This book is dedicated to my mother, Ayn Rand, and God.

Here the tables are turned: 7b might suggest that JFK was the stripper, 8b that Rand was my mother. 7a and 8a are clear.

So we have to choose whether to use an Oxford comma or not in each case. A blanket policy, pro or anti, just won’t work.

And it gets worse. Try this pair:

9a) I spoke to my uncle, a magistrate and a priest.
9b) I spoke to my uncle, a magistrate, and a priest.

Neither is clear. Does 9a mention three people or one? Does 9b mention three people or two? We need to rephrase somehow:

9c) I spoke to my uncle, who is a magistrate, and a priest.
9d) I spoke to a priest and my uncle, a magistrate.
9e) I spoke to my uncle, who is a magistrate and a priest.
9f) I spoke to a magistrate, a priest and my uncle.
9g) I spoke to my uncle as well as a magistrate and a priest.

We have to face the awful truth: the Oxford comma is not a magical blade that can chop any sentence into slices of perfect meaning. It’s just one fallible tool among many.

Use it when you must, avoid it when you must, choose as you prefer (or as your readers will prefer) when you can, and rewrite whenever that would be better.

Oh, and try not to get too worked up about it.

Accidence Will Happen – by Oliver Kamm

I find it hard to review books that I like, because I tend to feel that I need to be critical to be useful. (It’s a sickness.) And in the case of Oliver Kamm’s Accidence Will Happen: The non-pedantic guide to English usage, I’m also biased: he quotes my blog and thanks me in the acknowledgements for reviewing an early draft.

For what it’s worth, I really do like the book. Kamm’s view, in brief, is that standards are set by general usage and not by rule-mongers who think they know better, and obeying those sticklers is liable to bog you and your writing down with needless superstitions. He also says that Standard English is important but not intrinsically better or more correct than other dialects. Half the book addresses typical stickler arguments and half gives guidance on specific points.

Rather than review the book, I’m going to respond to a critical review of it.

Roger Lewis in the Times (paywall) says:

Any educated person used to know the difference between appraise and apprise, credence and credibility, enormity and magnitude. Those of us who were Mixed Infants in the Sixties would never say bored of, as the correct form is bored with, nor confuse different to with different from. We were aware by the age of seven that you’d be hanged not hung for capital crimes.

Kamm thinks a lot of this is bunkum.

Such rules are “just preferences”, and language should be left to its own devices — which is like saying that the destruction of the countryside ought to be allowed to proceed unchecked, as to halt such developments is to try to prevent an evolutionary inevitability.

Lewis’s timing is badly off. The roads and buildings he opposes were put there long ago; his position, in effect, is that map-makers should refuse to include them.

On matters of historical fact it’s wise to check the evidence. All of Lewis’s supposed errors are old, and their status as errors is mostly debatable.

To Lewis and his fellow pedants, “appraise” means evaluate and “apprise” means inform, and woe betide any modern illiterate who mixes them up; any educated person used to know the difference. Except that the earliest uses of “apprise” in the Oxford English Dictionary – from 1400 – meant to evaluate. It only started to be used for informing in 1694. Only a dozen years later, “appraise” (which had been about evaluating since 1424) started being used for informing. So the two words have a double history. That said, nowadays the disparaged usages are not that common, and Kamm agrees that we should avoid them. But this is on grounds of usage, not some mythical ancient decree.

“Credence”, the pedants say, means acceptance as true; it cannot mean “credibility”. The OED, though, lists a definition of “credence” as “the quality of being believed or believable; the likelihood of being true; plausibility, credibility”. This dates back to 1450. So this is one of many words with more than one meaning, and Kamm says confusion is in practice unlikely. The same is true of Lewis’s next bogeyman…

“Enormity”, according to the pedants, means great wickedness and not great size. The approved usage dates back to 1477 in the OED, but also around that time the word was used to mean an abnormality or a divergence from the normal standard, with no moral judgement necessarily involved. This makes sense when you think about the “norm” part of the word. So, while the approved usage means against the norm of morality, from 1792 the word was also being used to mean exceeding the norm in size. (Also, as Kamm notes, the earliest uses of “enormous” and “enormousness” had nothing to do with size but meant transgressing moral norms.)

“Bored of” is allegedly wrong while “bored with” and “bored by” are right. It’s true that usage has historically favoured the latter two, but this is changing. “Bored of” is still less common in print, but it’s growing. In speech and online, it’s clearly standard. The GloWbE corpus of web usage finds “bored of” to be more common on UK-based sites than “bored by” or “bored with”. And there’s no logic to the complaint anyway, says Kamm: we all accept “tired of” and “wearied of”.

“Different to” and “different from” I’ve dealt with before. Both are well-established and both are fine, as Kamm agrees.

I would actually go along with a couple of Lewis’s points – but not because of any immutable rules from some vague point in the past. I try to favour or avoid certain usages based on today’s usage and attitudes (which may depend on the audience). When he says “What activates my own pedantry isn’t futile nostalgia for an ideal classical epoch but a real fear of reverting to linguistic barbarism,” he is kidding himself as well as being hyperbolic. Abolishing sticklers’ pseudo-rules wouldn’t just leave English in excellent shape; it would leave English pretty much as it already is.

One other complaint Lewis makes is:

Kamm can be the revolutionary favouring diversity while flaunting (not flouting) a professional knowledge of dangling modifiers – he can do all this because he operates from a superior base of intellectual assurance. … He can say “errors” represent “flux” and that standards are not falling, because he knows in the first place these various conventions of grammar when he chooses to break them.

Similar points come up in John Rentoul’s and Simon Heffer’s recent discussions with Kamm.

I’m sceptical. Yes, it may be a bit tricky for some people to get their heads round the various distinctions that masquerade as matters of right and wrong. But I think most people can grasp that, for instance, the wrongness of “I ain’t done nothing” depends on context while the wrongness of “I anything not have doen” doesn’t.

It’s a common riposte to people like Kamm that they write in flawless Standard English while arguing that other forms of English are fine too. But Kamm makes it crystal clear that Standard English is vital to know; he just wants it taught without the disparagement of other dialects. He also wants it taught without time being wasted on the loose collection of pseudo-rules and superstitions that are followed by a minority of Standard English speakers.

On this point he practises what he preaches.

In the first few pages of the introduction to his book, he breaks several of these pseudo-rules: he ends sentences with prepositions, he uses “like” (not “such as”) to introduce an example, he treats “none” as plural, he uses “if” where sticklers would demand “whether” and he begins sentences with conjunctions.

The result is good prose. Those rules define not Standard English but a minority taste that imagines itself to be law.

An interesting comparison is Heffer’s book Simply English, which is uncompromising (albeit ill-informed and quirky) in insisting on such rules. Take this line from his introduction:

I am bemused that we should be asked to tolerate someone saying ‘he has flaunted all the rules’ when but for a moment of ignorance they could just as easily say ‘he has flouted all the rules’.

“Someone saying” is a fused participle, a construction that later in the book Heffer condemns as clumsy and confusing. Following his own rule, he should have written “someone’s saying”.

Also in that sentence is a singular “they”, which he elsewhere calls “unacceptable”. Following his own rule, he should have written “he” (because “the male should be taken to include the female”).

Heffer flouts the very rules he flaunts; they take pride of place in his opinions but have a weaker hold over his actual usage. This is not a sign of his illiteracy, though: the sentence is good Standard English. It’s a sign that these rules have been artificially grafted on to the language and don’t really belong there.

If even the most determined professional stickler can’t keep usages like these out of his carefully written book on correct usage, it’s a sign that those usages are correct and the rules against them are bogus.

The exorcism of bogus rules is the purpose of Kamm’s book, and I commend it.

A ridiculously brief, outrageously selective and painfully simplified history of Standard English

Standard English is one of many dialects of English. It’s the dialect that public affairs, the media and administration overwhelmingly use, and is the one most associated with education, prestige and power. Here I’m looking at Standard British English, but many of the points apply in other English-speaking countries.

Standard English dominates public life but not private conversation: only a minority of English speakers (largely defined by class) use it with friends and family, although far more switch into it when the occasion demands. It’s important to be able to do this, because Standard English opens so many doors in life.

Because of its status, many people think of it as ‘proper’ or ‘correct’ English, and scorn those who speak other varieties. But as well as being rude, this is mistaken. If you say ‘I seen them books but I didn’t buy none’ in the company of other people who talk that kind of dialect, that’s fine. If you say it in a job interview at the BBC, you may have a problem. Your error is one of judgement, though, not grammar.

Non-standard dialects are perfectly ‘correct’ – on their own terms. In fact, they’re mostly the same as Standard, but of course what we notice are the differences. And where they differ, they do so according to their own rules – not a lax application of Standard rules, as snobs like to imagine.

For example:

  • Standard uses the possessive ‘my’ and ‘your’ to form the reflexive pronouns ‘myself’ and ‘yourself’, but not the possessive ‘his’ and ‘their’ to form ‘hisself’ and ‘theirselves’. Other dialects have a regular system that does use the possessives consistently this way.
  • Standard doesn’t allow multiple negation – ‘I didn’t want none’ – but most other dialects do (along with some other languages, but je ne sais pas how many).
  • Some dialects make much more use of flat adverbs (without the ‘-ly’) than Standard, such as ‘they talked real slow’.
  • Standard has varied verb endings – ‘you like’ but ‘she likes’ etc. – while other dialects may use the same form, whether ‘likes’ or ‘like’, across the board.
  • Some dialects use singular noun forms for plurals after a number – ‘they walked five mile’. Standard doesn’t, although it does something similar in constructions like ‘a five-mile walk’.
  • Standard has lost the old distinction between singular ‘thou’ and plural ‘you’, while some dialects maintain it. Others have developed new distinctions, using ‘you’ for singular but ‘youse’ or similar for plural.
  • Irregular verbs may differ in the past and perfect tenses. Standard says ‘I spoke’ but ‘I have spoken’ while Tyneside English says ‘I spoke’ and ‘I have spoke’. On the other hand, Standard says ‘I got’ and ‘I have got’ while Tyneside says ‘I got’ but ‘I have getten’ (a relative of the old ‘gotten’, which still thrives in the US).

So in some cases Standard English draws more distinctions than other dialects, in other cases fewer. Some of its conventions are more consistent and some less consistent.

It also isn’t the same thing as formal language. Standard English can range from ‘The consequences of further inaction would be somewhat vexing’ to ‘You’d better get your skates on or I’ll be pretty pissed off’. Standard can be casual, idiomatic and obscene. That said, the situations in which it’s used are more likely to be formal.

The linguists’ label ‘Standard’ reflects status, not quality – and that status is the result of historical accident. So let’s race through a millennium or so…

