What words “should” mean and what they actually mean

Steve Finan of the Sunday Post has had a rant about “decimate”. He insists on the “reduce by 10%” meaning.

It’s a word I struggle to care about, so rather than get into the details I’ll point you towards good posts by Jan Freeman and Ben Zimmer.

And here are a few historical facts; you can decide for yourself which, if any, we should care about when deciding how to use the word today:

  • 1600: First recorded use in English of “decimate”, meaning to punish a unit of Roman soldiers by killing 10% of them
  • 1663: First recorded use of “decimate” to mean damage, destroy or remove a large part of – not necessarily 10%
  • 1868: First recorded complaint about the non-10% use of “decimate”

What interests me more than this word is Finan’s beautiful display of the I-know-best attitude to language:

Before anyone points out: “But the dictionary says…” the dictionary is wrong. Dictionaries are great on how words are spelled; they are unreliable on what a word should mean because they insist on giving definitions based on what the public is actually saying rather than what it should be saying.

This is like complaining that maps are unreliable because they insist on showing the actual geography of towns instead of the way the streets should have been laid out. Or that instruction manuals are unreliable because they insist on telling you how devices actually work instead of how they should have been designed.

Dictionaries tell you what words mean. Words are not abstract units of pure meaning but tools that a community uses to communicate. If you want to communicate, you need to use them the way the community does, not the way you think they ought to be used. This isn’t about being correct or incorrect: it’s about success or failure. Insisting on a notion of correctness that others don’t share is a sure route to failure.

And the ways people use words change.

Finan protests that he’s totally OK with language change in principle:

I enjoy seeing the language evolve. I’m no stick-in-the-mud. Awful used to mean full of awe; nice used to mean stupid.

But he tries to set out a logical, objective criterion for which kinds of language change are good and which are bad:

But there is no good reason to change the meaning of decimated. We already have perfectly adequate words to describe damage. If we accept that decimated no longer means reduce by one in 10, we are left without a word for being reduced by a tenth. English will have become less expressive as a means of communication merely because ignorant people have been copying one another.

I have a few problems with this.

Firstly, the awkward, pedantic point (sorry, sorry) that nobody is proposing to change the meaning of “decimated”. The change happened 350 years ago. I suppose Finan could say that he is proposing change by wanting to scrap a long-established meaning, but that wouldn’t quite fit his argument.

Secondly, he thinks words shouldn’t acquire new meanings when we already have other words with those meanings. But when “awful” and “nice” changed (both during the 18th century), their new meanings were already well-covered. Finan ought to damn these new-fangled mistakes, but instead he approves. I suspect that in his heart he doesn’t really think the existence of synonyms is “wasteful and pointless”, as he describes the common usage of “decimate”. I suspect that his argument is just a way to rationalise a pet peeve.

Thirdly, if having a word with a certain meaning is really so useful to people, they’ll keep that meaning alive. But most people don’t go around reducing things by a tenth all that often. That sense of “decimate” has largely fallen into disuse. But in any case, we don’t desperately need a single word when we have several perfectly clear phrases that mean reduce by a tenth, such as “reduce by a tenth”.

If you want to talk about reducing things by a tenth, saying “decimate” may not be effective. So many people understand it the other way that you’re likely to be misunderstood. You can harrumph all you like about being right, but you’ll be wrong. Judged against the only yardstick that matters – making your meaning clear – you’ll fail.

As I said, “decimate” isn’t that interesting. But in Finan’s eyes, the problem isn’t just about losing this one word. The whole language is at risk!

Where shall we end up if we accept the errors people make as new meanings? Soon, everything will mean anything and nothing will also mean anything.

Oh, please.

After centuries, even millennia of change there’s no prospect of everything meaning anything and nothing meaning anything. And there never will be. Most linguistic innovations meet with incomprehension and die a quick death. Some recur but stay on the fringes of the language. A few, if people find them useful, get picked up and become mainstream.

The discipline of market forces, not the edicts of a self-appointed elite, will make sure English remains as expressive as we need it to be.

The history of language is a slow journey with no destination. What matters is that we travel together.

Grammar haiku win / Win win win win win win win / Did I mention win?

After four glorious years of failure, I’ve won the annual Grammar Day Tweeted Haiku Contest!

Organised by the American Copy Editors Society, the contest is pretty self-explanatory: you tweet a haiku about grammar, and then you tweet half a dozen more, and then maybe one of them turns out to be kind of OK. And apparently it did:

I also took fifth place with this one:

So I guess that means that on average I’m third?

Anyway, I’m thrilled to bits. My thanks to ACES (especially Mark Allen) and the judges (Adriana Cloud, Corrie Loeffler, Laura Poole, Carol Saller and Karen Yin), and my awe to the other entrants (as always, there are some truly brilliant little works of beauty and genius).

These are my efforts from previous years:

The meanings of “refute”

Do you get annoyed by the modern use of the word “refute”? Lots of people use it to mean “deny” or “reject” instead of “disprove” – as in “I refute these allegations”, which hapless miscreants mumble on the news every day as their reputations collapse.


“You’re just contradicting me!” “No I’m not.”

It makes me a bit twitchy when I hear it. But it’s not that modern. This year is the centenary of the first recorded complaint about “refute”.

In 1916, Robert Palfrey Utter published Every-Day Words and Their Uses. It included this:

To refute a statement, opinion, accusation, imputation, or charge, is not merely to call it in question, or deny it without proof, but to disprove it, overthrow it by argument, show it to be false.

The usage Utter condemned had been growing in popularity around the turn of the 20th century. The earliest example recorded in the Oxford English Dictionary is from 1886:

Mind, i ain’t a snob; I utterly refute that idear. I don’t judge bi the koat he wares, or the joolery, or nothing of that kind

This is a satire of ill-educated language, so maybe “refute” was a deliberate mistake. But even so, that would suggest it was a known misuse.

In any case there are earlier examples. The Guardian’s Peter Bradshaw fears he may have spotted one from 1815, another century before Utter:

But now I read in Jane Austen’s Emma about the heroine’s apparent pique at the existence of a rival beauty: “Mr Knightley had once told her it was because she saw in her the really accomplished young woman, which she wanted to be thought herself, and though the accusation had been eagerly refuted at the time, there were moments of self-examination in which her conscience could not quite acquit her.”

Oh no. Could it be that Austen is a non-refutist? I prefer to think she is deliberately reflecting Emma’s thought processes: in her nervous vexation, Emma in her own mind fudges the meaning of “refute”.

Bradshaw’s thinking sounds wishful.

After rummaging through a few classics, I found an irrefutable example in Anthony Trollope’s The Warden (1855):

‘But you do love him,’ said Mary, who had followed her friend to the window, and now spoke with her arms close wound round the other’s waist. ‘You do love him with all your heart—you know you do; I defy you to deny it.’

‘I—’ commenced Eleanor, turning sharply round to refute the charge; but the intended falsehood stuck in her throat, and never came to utterance. She could not deny her love…

So the “deny/reject” meaning has a long pedigree.

And while Johnson’s dictionary of 1755 defines “refute” only as “prove false or erroneous”, the OED lists other, older meanings.

It first records the “disprove” meaning in 1533. Another definition, “to demonstrate error”, goes back to 1572, although the OED says this usage has now become rare. The earliest known meaning was “to refuse or reject (a thing or person)”. This was a Scottish term that the OED now says is obsolete; its first example is from 1513 and its last 1686.

These are intriguing variations on what kind of opposition counts as refutation – but all the same, the “disprove” meaning was clearly dominant by 1700. In the 1800s, though, the “deny/reject” usage – not a million miles from the original Scottish meaning – started to creep in.

And eventually Robert Palfrey Utter noticed it. Like-minded critics have taken up his cause ever since, fighting a hundred years’ war and suffering defeat after defeat as the usage has gained more and more ground.

Take the Guardian. Its style guide stands firm against the not-so-new meaning (“this much abused word should be used only when an argument is disproved; otherwise contest, deny, rebut”) but its articles often don’t. I looked at 20 of its recent uses of the word: 5 of them meant “disprove”, 14 meant “deny” or “reject” (8 in quotes, 6 in the journalists’ own words), and one I couldn’t tell.

I found a similar picture – banned, but used anyway – in the Telegraph.

The growing use of the “deny/reject” meaning has bred growing acceptance. Almost all contemporary dictionaries list it, although many give a usage note like this from the OED:

Criticized as erroneous in usage guides in the 20th cent. In many instances it is unclear whether there is an implication of argument accompanying the assertion that something is baseless

The American Heritage Dictionary suggests that opposition to this meaning is fading: in 2002, 62% of its usage panel (an assortment of literati) approved of this example: “In the press conference, the senator categorically refuted the charges of malfeasance but declined to go into details.”

Among usage commentators, Bryan Garner and Steven Pinker oppose this; those accepting it include Pam Peters and Oliver Kamm. Jeremy Butterfield even-handedly says: “it will sound normal to those who normally use it in this way, and aberrant to those who do not”.

My advice: Don’t waste time fretting about which meaning is “correct”; just focus on what will work for your audience.

With people from academic backgrounds – especially in philosophy or law – you can assume they’ll be happy with the “disprove” sense, and they may dislike the “deny/reject” sense. For the public in general, I think the scales have tipped the other way: “deny/reject” is how they most often hear it used, which means the “disprove” meaning may not come across when intended.

On the whole, it’s safest to avoid the word unless you’re confident you’ve got an audience who will understand it and a context that can point them in the right direction.

But “irrefutable” is still safe to use, because its meaning is pretty close to “undeniable”.

Rabid reaction

Michael Oman-Reagan has been looking for sexism in Oxford Dictionaries entries. Specifically, he thinks that their example sentences perpetuate stereotypes.

He lists six entries that trouble him. I have no idea whether these are cherry-picked or representative, but take a look and see what you think. Carolyn Cox has further discussion.

The most interesting of Oman-Reagan’s six is Oxford’s entry on “rabid” and the example it gives: “a rabid feminist”.

He complained to Oxford on Twitter. They replied, saying that “’rabid’ isn’t always negative”.

Is this true?

Their full definition says:

Having or proceeding from an extreme or fanatical support of or belief in something: ‘a rabid feminist’

“Extreme” and “fanatical” do kind of sound bad.

They give three other examples of the word’s use.

Giving grammar advice? Whatever you do, don’t check it

Jim Baumann writes a column in the Chicago Daily Herald under the title Grammar Moses, in which he dispenses (mostly bad) advice on grammar and usage.

This week, Moses has crowdsourced his tablets.

One of his contributors writes that we should use ’til and not till as the short form of until. Till, he says, “can be a noun, meaning a cash drawer, or a rather inexact verb describing what growers do to the soil so as to produce crops or decorative plants”.


Any dictionary you bother to check will confirm that till is a legitimate word in its own right. It’s not a short version of until. Things are the other way round: till dates back to the ninth century, until only to the twelfth, when it was derived from till (compare the relationship between unto and to).

As for ’til, Merriam-Webster says that it is “a variant spelling of till used by writers who do not know that till is a complete, unabbreviated word in its own right”.

Baumann’s correspondent offers more advice.

We should say “the expressed written permission” and not, as is common, “the express written permission”. This is why:

One needs to use an adjectival form, and ‘expressed’ is it.

In contrast, while there is certainly a verb ‘express’ meaning to ‘render one’s thoughts,’ such does not fit the bill herein. Likewise, the adjective ‘express,’ as in reference to a fast train that doesn’t stop at all stations, does not fit, either.


Any dictionary you bother to check will tell you that express can mean “explicit and clearly stated”, which is what it means here. Expressed would be quite the wrong word: writing is a form of expression, so anything that is written is by definition expressed.

This use of express is venerable. The OED cites Chaucer around 1386: “Wher can ye seen… That highe God defended mariage By expresse word?”

A final contribution comes from Baumann’s colleague Jim Slusher, a tormented fellow who beats himself up for using sentence adverbs. Amazingly, he doesn’t realise that they’re a standard and well-established way of expressing the writer’s view of a statement, such as: “Just as importantly, they should care about their constituents.”

Sadly, he thinks: “In that construction, the adverb ‘importantly’ describes how they should care about their constituents. I’m not sure how one cares importantly. But that’s what it means.” The poor chap.

Disappointingly, this time only four of my six dictionaries confirm the sentence-adverbial use of importantly (meaning “it is important that”). The other two say only that it’s the adverb form of important.

(The dispute about hopefully is a different matter: you can’t paraphrase it as “it is hopeful that”, which discombobulates some people. But even they are generally content with other sentence adverbs.)

This tripe is what passes for expert commentary on grammar. Hazy ideas, tendentiously rationalised, boldly stated, never checked.

And it’s often so, so easy to check these things. But if you do, you may find that you don’t have anything left to complain about.

I’m a people person

When should you use “people” and when should you use “persons”?

Short answer: you should almost always use “people”.

A controversy sprang up during the 19th century, according to Merriam-Webster’s Dictionary of English Usage, when a few people – or persons – started insisting on a difference. “People”, they said, was for talking about a population or humanity in general, but when talking about a specific group you should use “persons”.

The exact nature of the imagined rule varied from commentator to commentator: some said “persons” should be used after specific numbers (“23 persons”), others said it should also be used after approximate round numbers (“more than a thousand persons”), others still said it should also be used after modifiers like “many” and “several”.

This idea, in one form or another, made it into various style books and writing guides (mostly American) up to around the 1980s, but it is now almost dead and you can safely ignore it – as Chaucer, Shakespeare, Defoe, Dickens, Disraeli, Wells, Hemingway and others ignored it.

The people are rising up

Looking at the Google Books database, I found that “two people” has risen in popularity, overtaking “two persons” around 1950. “Three people”, “four people” and “five people” took the lead over their rival forms around 1975, “ten people” around 1970, “hundred people” around 1910 and “thousand people” around 1890. “Some people” and “many people” went ahead by 1880, “several people” around 1970.
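For the curious, each of those crossover dates is just the point where one frequency curve overtakes the other. Here’s a minimal sketch of how a crossover year can be found, using invented toy figures for illustration – these are not real Google Books numbers:

```python
def crossover_year(years, freq_a, freq_b):
    """Return the first year in which series A's frequency
    exceeds series B's, or None if it never does."""
    for year, a, b in zip(years, freq_a, freq_b):
        if a > b:
            return year
    return None

# Invented figures (per million words), standing in for real ngram data:
years = [1900, 1925, 1950, 1975, 2000]
two_people = [0.8, 1.1, 1.6, 2.4, 3.0]   # made up
two_persons = [1.5, 1.4, 1.5, 1.2, 0.9]  # made up

print(crossover_year(years, two_people, two_persons))
```

The real comparison works on per-million-word frequencies from the ngram corpus, but the principle is the same: walk the two series until one pulls ahead.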

The Victorian commentators clearly noticed a tendency for “persons” to be used more often than “people” in certain contexts. So, gripped by the delusion that a word with more than one use is an abomination unto God, they decided that the tendency should become law.

It didn’t. The opposite happened.

The Corpus of Contemporary American English, covering usage from 1990 to 2012, has “two people” ahead of “two persons” by a ratio of more than 30 to 1.

Trying to prove a point

The only argument I’ve found for this distinction (as opposed to people saying “It just isn’t done that way!” when manifestly it is) comes from William Strunk in 1918:

The word people is not to be used with words of number, in place of persons. If of “six people” five went away, how many “people” would be left?

If the sense of absurdity he’s trying to create isn’t coming across strongly enough, you can look at later editions, updated by E. B. White, where this line appears as:

The word people is best not used with words of number, in place of persons. If of “six people” five went away, how many people would be left? Answer: one people.

Well, golly gosh. That clinches it. But in fact Strunk and White have undersold their case. This is really an argument against all irregular plurals. Look:

If of “six children” five went away, how many children would be left? Answer: one children.

Absurd! And to be honest, really it’s an argument against plurals of any kind:

If of “six cats” five went away, how many cats would be left? Answer: one cats.

Absurd! Just not in the way Strunk and White think. Their procrustean tomfoolery proves nothing.

Modern advice

I can’t find a single contemporary dictionary that maintains the rule. The nearest is this usage note from Oxford Dictionaries:

The words people and persons can both be used as the plural of person, but they have slightly different connotations. People is by far the commoner of the two words and is used in most ordinary contexts: a group of people; there were only about ten people; several thousand people have been rehoused. Persons, on the other hand, tends now to be restricted to official or formal contexts, as in this vehicle is authorized to carry twenty persons; no persons admitted without a pass.

But even in those official contexts, “people” would work fine.

Garner’s Modern American Usage (2009) says that “twelve persons on the jury seems stuffy to many readers, and most native speakers of [American English] would say twelve people on the jury. In contexts like that, people has long been used and is the more natural phrasing.”

Jeremy Butterfield’s new edition of Fowler’s Modern English Usage (2015) says that “persons, whether preceded by a numeral or not, is tending… to yield to people, and to retreat into somewhat restricted, mostly (semi-)legalistic use”. He means phrases such as “committed by person or persons unknown” and “hidden on their persons”. There, “people” would be odd.

So, if you’re writing a police report (update: or in certain other legal contexts), you may want to use “persons”. Likewise if you’re writing about grammar (first, second and third persons) or Christian theology (the three persons of the Trinity). And likewise if you’d like to sound unusually formal or old-fashioned.

Otherwise: power to the people.

The Stroppy Editor Word of the Year 2015

It’s time to announce the Stroppy Editor Word of the Year 2015!

Yes, even though it’s still only November, we think it’s somehow acceptable to take a complete overview of the entire year. In doing so, we follow the precedent set long ago by the former organisers of the Bethlehem Most Angelic Baby Contest, whose timing they defended until their dying day and indeed thereafter, in an unspecified circle of Hell.

2015 has been a fantastic year for words. “Happy”, “outside”, “geneticist” and “perturb” are just a few of the words that have seen use this year.

Our Word of the Year judges weighed the merits of a diverse shortlist of highly impressive words, but were ultimately swayed – descriptivists that they are – by usage. So, after much debate, they decided that the winner is… “the”!

Extensive research showed that “the” was the most popularly used English word this year. Again.

This is the 485th consecutive time that “the” has won this prestigious trophy. We congratulate it on its victory and remind other words that nominations for 2016 are now open.

About “the”

  • “The” is best known as the definite article, although it has recently taken to hanging out with fancy-pants university types and calling itself a determiner.
  • “The” is also often used as an adverb, in phrases such as “the sooner the better” and “none the wiser”.
  • “The” can be pronounced with two different vowel sounds. People usually say it with an “uh” if the next sound is a consonant and an “ee” if the next sound is a vowel – or if they’re questioning the identity of a famous person a friend claims to have met.
  • While English only has one “the”, other languages – less efficient languages – have several words. German, for instance, has six words for “the”. It also, no doubt, has a word for “the torment of schoolchildren as they struggle to learn loads of pointless versions of the same goddamn word”. Or maybe it’s like Eskimos and snow; maybe there’s just so much more definiteness in Germany that they’ve learned to distinguish six different types of it.
  • “The” is an actual fucking word. Yes, I’m looking at you, so-called Oxford so-called Dictionaries.

About the Stroppy Editor Word of the Year competition

The Stroppy Editor Word of the Year competition has been running since 1498, although it was put on hold during the Preposition Wars of the late 1700s, and it was cancelled on grounds of sensitivity after the Infinitive Incident of 1903.

About the judges

The panel of judges included:

  • Lynne Fuss, apostrophe fanatic and greengrocer-botherer
  • Devilled Gwynne, over-seasoned amateur grammarian
  • Joffrey Pullum, linguist and tyrannical Prince of Westeros

Licence or license? Practice or practise?

If you’re American, it’s simple: use license and practice and you won’t go wrong.

If you’re British, things are more complicated. Use licence and practice as nouns and license and practise as verbs. The adjectives formed from the verbs also have the s spelling.

Hence this passage from the UK’s General Medical Council:

If you want to continue to hold a licence to practise, then you will need to revalidate like every other doctor who is licensed. However, you may not need a licence to practise if you don’t carry out any clinical practice.

This is glorious in its precision, but also quite magnificently daft.

Let’s face it: this distinction is pointless. The Americans are right to reject it. It serves no purpose other than to make some people feel confused, to make others feel smug, and to waste everyone’s time. Precisely zero confusion would result if we spelt the verbs and nouns the same way.

Why do we in Britain have this distinction? (From what I gather, most other English-speaking countries follow British rules, although Canadian usage leans towards American.)

The rule is often explained by analogy with advice (noun, with a c) and advise (verb, with an s) – or device and devise, or prophecy and prophesy. That’s how I learned which one to use. But with each of these three pairs, the noun and verb are pronounced differently too, so the analogy only goes so far.

And many other words manage perfectly well to do double duty as verbs and nouns without needing their endings spelt differently: promise, release, incense, reverse, discourse, divorce, advance, silence, sentence, notice…

What’s going on?

Looking into the history, I found that the practice/practise distinction is much older and better-established than the licence/license one. Practice/practise was in place 300 years ago, and for a while the pronunciations did differ, but licence/license was a 19th-century rationalisation that has struggled to catch on. And both distinctions are now weakening.


According to the Oxford English Dictionary, the P-word first appeared in English about 600 years ago. This was, to quote Terry Pratchett, “before they invented spelling”. So early forms are all over the place: practysse, practyce, praktese, practis… and many more!

I looked at the OED’s example sentences to see how often spellings with c and s – and even z – were recorded across the centuries. With the caveat that this is just a small selection, there does seem to be a pattern:


At first, the word was mostly used as a verb and mostly spelt with an s. This fits with the etymology, which the OED roots in the Middle French practiser and the Latin practizare. Then the noun use grew, and as it grew it became more likely to have a c, as was more usual for nouns. This in turn influenced the verb spelling, giving it a c some of the time. But by the 1700s, the distinction was pretty firmly there.

The early z spellings – more common for the verb than the noun – are intriguing. Here are a couple:

Pleasanter to practize is this than the former and moste exact for Altitudes. (1560)

He practized the vtter ouerthrowe not onely of all Christian societie, but of the state of the whole world also. (1581)

These hint at a different pronunciation. The OED reports that the verb “was originally stressed on the second syllable”, but later shifted to match the stress of the noun. With –ize endings, you can see how that might have gone.

And take this 1439 poem by John Lydgate, in which he rhymes practised with devised:


Either it was spoken differently back then or Lydgate was just a useless poet.

And there’s more recent evidence.

In the introduction to his 1836 dictionary, Benjamin Humphrey Smart talks about the pronunciation of similar nouns and verbs. He points out that one way we sometimes distinguish the two is with a difference of stress: for example, refuse is stressed on the first syllable as a noun but on the second as a verb. This pattern is far from universal, but there are definitely others: incense, contract, upset, conduct…

Then Smart says:

The vulgar, then, are in the right when they say prac’tice [noun, stress on the first] and to practise’ [verb, stress on the second]; but here… the caprice of fashion interferes, and in this one instance obliges us to pronounce noun and verb, though differently spelled, in all respects alike.

Whichever “vulgar” people he had in mind, this shows that an alternative pronunciation for practise was still getting some use as recently as the early 1800s.

If the noun and verb were stressed differently, that would fit with their being spelt differently, too – just like advice and advise. Then, after a while, the pronunciation of the increasingly common noun took over that of the verb, but the spellings – with a body of written evidence to establish them – stayed as they were.

In more recent history, the Google Books data shows that, from 1800 to 2000, the practice spelling of the noun has reigned supreme, with practise very rare. The verb’s spelling has been more mixed, but practise has consistently been well ahead of practice – until recently. This recent rise in practice as a verb might be part of the catastrophic modern decline in literacy that swivel-eyed liberals have inflicted upon our once-great education system, but I think it’s more likely that this is a sign of growing American influence.


The L-word (spelt at first in various ways) is about as old as practice/practise. But the licence/license distinction isn’t. Looking again at the OED’s example sentences, century by century:


Here there’s much less of a pattern. C was more common for both noun and verb at first, and then the s spelling became more common for the verb – but also quite common for the noun.

From these sentences, there’s limited evidence to back up the OED’s claim that the licence/license distinction is “now prevailing usage”. That claim was made in 1902 (the entries for practice and practise were updated in 2006).

The OED of that day seems to have been fighting a battle. It notes that late-19th-century dictionaries “almost universally have license both for noun and verb, either without alternative or in the first place”, but insists that the s spelling “has no justification in the case of the noun”.

Samuel Johnson’s dictionary of 1755 uses s for the noun, although noting some examples that use c:

[Image: Johnson’s dictionary entry for the noun, spelt license]

The first dictionary I can find to state the distinction is Smart’s of 1836:

[Image: Smart’s 1836 definitions of licence and license]

He explains that this is because of the general principle that –ce is a noun ending while –se is a verb ending. This ignores, of course, promise, release, incense, reverse, discourse, divorce, advance, silence, sentence, notice etc.

However, Smart doesn’t observe the distinction himself. Elsewhere in the book he defines apothecary as “a dispenser of medicines, having also a license to practise medicine” and allowance as “Sanction, license, permission”.

Likewise for other members of the 19th-century grammarati. Henry Alford, in his bestselling The Queen’s English (1864), doesn’t observe it:

A curious extension of this license is sometimes found.

I expect we shall soon see “groceress” and “tea-dealeress,” and licenced “vendress of stamps.”

And Henry Sweet, in A New English Grammar (1892), acknowledges both spellings for the noun.

Then the OED, echoing Smart, laid down the law, and from the Google Books data the c spelling of the noun did become more common in the early 20th century. Later in the century, though, the s spelling began rising again – US influence, I’d guess, as with practise.

The spelling of the verb was mixed throughout the 19th and 20th centuries, although from the 1980s the s spelling shot up. There could be two reasons for this: either a modern surge in prescriptivism or growing US influence. The latter fits much better with the other results.

So, at no point in British history has the licence/license distinction been dominant. Its supporters (like Henry Fowler in 1926) did their best, but on the whole people persist in not finding much use for it.

In the 21st century

Most modern British dictionaries still state both distinctions as fact. The only exception I’ve seen is the online Oxford Dictionaries, whose entries feed into the gradual updating of older OED entries. It accepts that licence is now “an acceptable variant spelling of the verb”.

What about recent usage?

As far as I can tell, in edited text, spelling still mostly follows the dictionaries. So for business writing I’d recommend doing the same, for the time being. But more broadly, the distinctions are unravelling – at least in unedited writing.

For that, I looked at GloWbE, the corpus of Global Web-based English, which covers online usage from 2012 (while some websites are edited, many aren’t). On British web pages, this is where we are:

  • Practice as a noun remains dominant: it’s more than 20 times as common as practise (comparing searches for the practice and a practice with the practise and a practise).
  • Practice as a verb makes a surprisingly strong showing: you practice, they practice and we practice are all more than twice as common as the practise equivalents. Practised and practising are still more common than practiced and practicing, but not by much.
  • Licence as a noun is only about twice as common as license.
  • License as a verb seems to be about twice as common as licence (although the numbers are small on either side). Licensed and licensing are 15–20 times as common as licenced and licencing.

I think we’re in the middle of a generational shift. Soon, dictionaries will accept practice as a variant spelling of the verb. Then, as it becomes more and more popular, they’ll stop labelling it a variant. Practise will survive but seem old-fashioned, like whilst or homoeopathy.

Licence/license will become blurrier: more dictionaries will accept licence for the verb, and then license for the noun (all those “software licenses” we see). Eventually we’ll have a situation like that of adviser/advisor or artefact/artifact, where the choice is a matter of taste. I doubt licence will completely disappear: many of the bodies that issue licences are traditionally minded. No transport minister would ever want to announce to Parliament the Americanisation of driving licence.

Those of us who’ve had the distinctions drilled into our heads will continue to twitch when we notice a “wrong” spelling, but in time we’ll die out. The earth will close over our heads and English will live on, that bit more efficient for being rid of us.

Contractions: which are common and which aren’t?

Contractions – “you’re”, “we’ll”, “can’t”, “don’t” etc. – generally make language sound less formal, and avoiding them makes it more formal.

Over the years, contractions have become more acceptable higher up the formality spectrum. This is part of a general shift that’s been going on for decades: styles of language that were once firmly seen as casual are now more widely used in more businesslike contexts. Likewise, styles of language that would have been common and neutral in, say, the 1950s now tend to come across as very formal.

On the whole, your best bet is to trust your judgement. Use contractions or not depending on whether you feel comfortable saying the phrase that way, in that sentence, in that context, for that audience. But take care: if your tastes are more old-fashioned or new-fangled than your audience’s, you may miss your mark.

And if you’d like some evidence to double-check your judgement against, I have plenty – from COCA, the Corpus of Contemporary American English.

COCA is a database of English use – 450 million words’ worth – from a wide range of sources. It covers fiction, newspapers, magazines, academic journals, and transcripts of unscripted TV and radio, from 1990 to 2012.

I searched COCA for 77 contractions and their spelt-out counterparts (there are others, but life is short). For instance, I searched for “didn’t” and “did not”, recorded the number of uses per million words, and then divided the “didn’t” number by the “did not” number.

This gives the relative frequency of use for each contraction: how common each is relative to its spelt-out version. A small number means a phrase is rarely contracted, 1 means both versions are equally common, and more than 1 means it’s usually contracted. The more common a contraction is, the more comfortable you can feel about using it.
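The calculation above is simple enough to sketch. Here it is in Python, with made-up per-million-word frequencies standing in for real COCA search results (the function name and figures are illustrative, not from the original analysis):

```python
# A sketch of the relative-frequency calculation described above.
# The per-million-word counts here are invented for illustration;
# real figures would come from COCA searches.

def relative_frequency(contracted_per_million: float,
                       spelt_out_per_million: float) -> float:
    """Ratio of contracted to spelt-out frequency.

    Less than 1: the phrase is usually spelt out.
    Exactly 1: both versions are equally common.
    More than 1: the phrase is usually contracted.
    """
    return contracted_per_million / spelt_out_per_million

# Hypothetical per-million-word frequencies for two pairs:
pairs = {
    ("didn't", "did not"): (400.0, 250.0),
    ("oughtn't", "ought not"): (0.5, 5.0),
}

for (contracted, spelt_out), (c_freq, s_freq) in pairs.items():
    ratio = relative_frequency(c_freq, s_freq)
    print(f"{contracted} vs {spelt_out}: {ratio:.2f}")
```

With these invented numbers, “didn’t” scores 1.6 (usually contracted) and “oughtn’t” scores 0.1 (rarely contracted) – the same reading as the tables that follow.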

Before we get into the detail, here are two top-level findings, averaged across all 77 contractions. First, the trend over time:

[Chart: contraction use over time, averaged across all 77 contractions]

Over just a couple of decades, contraction use has increased.

Second, the differences between different kinds of source (averaged across 1990–2012):

[Chart: contraction use by source type, averaged across 1990–2012]

No surprises here. Contractions are extremely common in speech and fiction, quite common in magazines and newspapers, and less common in academic journals.

The next tables, getting into the detail, only cover non-fiction writing. This is because I think people are generally happy judging when to contract phrases in speech (and fiction, for those who write it). So from here on, I’m only looking at newspapers and magazines (averaged together, as their overall results are similar) and at academic journals.

Here are contractions ending in “–n’t”:

[Table: relative frequencies of “–n’t” contractions]

The lower the frequency, the more cautious it’s wise to be. But even for common contractions, there will be sentences where it’s better to spell them out. And for rarer contractions, there will be sentences where they work better.

The other common kind of contraction is the sort that joins a pronoun and a verb of the “be” or “have” families or “will”, “would” or “had”.

Here it gets a bit tricky:

  • “Would” and “had”. “I’d” can mean “I would” or “I had”. So to get meaningful results, I had to search for slightly longer phrases. I compared “I would be” with “I’d be” and “I had been” with “I’d been”. Likewise for other pronouns.
  • “Is” and “has”. “It’s” can mean “it is” or “it has”. So I compared “it is being” with “it’s being” and “it has been” with “it’s been”. Likewise for the pronouns “he”, “she”, “who” and “that”. This doesn’t work for “there”, because “there is being” doesn’t really get any use. So I compared “there’s been” with “there has been” and “there is a” with “there’s a” (“there has a” is pretty much non-existent).

These are the results:

[Table: pronoun–verb contractions in newspapers and magazines]

[Table: pronoun–verb contractions in academic journals]

“Be” verb phrases are the ones most often contracted. In newspapers and magazines, the other kinds (especially “will”) are also often contracted. “Is” is contracted more often than “has”, and “would” is contracted more often than “had”. In journals the pattern is mostly similar but the numbers are smaller.

In both tables, “you” phrases are the most often contracted, followed by “I” and “we”. First-person writing tends to be more casual, and writing that addresses the reader in the second person even more so. Phrases using other personal pronouns – “he”, “she” and “they” – are next-most-often contracted. “It” and “there” phrases follow, although most of the contractions there are “it’s” and “there’s”. Bringing up the rear are “who” and “that” phrases (with a strong showing from “that’s”).

I looked at a few other contractions that don’t fit either of the above groups.

“Should’ve”, “would’ve” and “could’ve” are pretty rare, with relative frequencies of 0.02–0.03 in newspapers and magazines, and 0.00 in journals. “Let’s” (for “let us”) is very common, scoring 3.79 in newspapers and magazines and 0.70 in journals.

I’ll end by quoting some recent usage manuals and style guides whose advice, in light of this data, seems fair:

Contractions of the type I’m (= I am) and don’t (= do not) are exceedingly common in informal and online writing and increasingly found in various kinds of fairly formal contexts (e.g. in book reviews).

– Fowler’s Dictionary of Modern English Usage (2015)

In the past they were felt to be too colloquial for the written medium, and editors of academic journals are still inclined to edit them out. The writers of formal documents may feel that they undermine the authority and dignity of their words. But the interactive quality that contractions lend to a style is these days often sought, in business and elsewhere. They facilitate reading by reducing the space taken up by predictable elements of the verb phrase, and help to establish the underlying rhythms of prose.

– Cambridge Guide to English Usage (2004)

Many writers, especially those who write in formal situations, feel uncomfortable with contractions. And perhaps contractions don’t generally belong in solemn contexts.

But why shouldn’t writers use them in most types of writing? Some excellent writers use contractions to good effect, even in books…

The common fear is that using contractions can make the writing seem breezy. For most of us, though, that risk is nil. What you gain should be a relaxed manner – not breeziness.

– Garner’s Modern American Usage (2009)

Sometimes, lots of ‘cannot’, ‘should not’ etc can seem archaic and formal. That’s a tone we can move away from without jeopardising the overall tone of information coming from government.

– Writing for GOV.UK (2015)

Most types of writing benefit from the use of contractions. If used thoughtfully, contractions in prose sound natural and relaxed and make reading more enjoyable. Be-verbs and most of the auxiliary verbs are contracted when followed by not: are–aren’t; was–wasn’t; cannot–can’t; could not–couldn’t; do not–don’t; and so on. A few, such as ought not–oughtn’t, look or sound awkward and are best avoided. Pronouns can be contracted with auxiliaries, forms of have, and some be-verbs. Think before using one of the less common contractions, which often don’t work well in prose, except perhaps in dialogue or quotations. Some examples are I’d’ve (I would have), she’d’ve (she would have), it’d (it would), should’ve (should have), there’re (there are), who’re (who are), and would’ve (would have). Also, some contracted forms can have more than one meaning. For instance, there’s may be there is or there has, and I’d may be I had or I would. The particular meaning may not always be clear from the context.

– Chicago Manual of Style (2010)

The triumph of the linguistic free market

The Idler’s annual Bad Grammar Awards are not about ridiculing grocers. They are about promoting the public discussion of grammar and, perhaps more importantly, exposing cant and humbug.

This is a noble aim and I am proud to support it.

By chance, I myself noticed a fine dose of cant and humbug the other day, in the Idler’s announcement of its Bad Grammar Awards.

It singles out Oliver Kamm’s “utilitarian approach to grammar”, saying:

He reckons that if a mistake is made enough times, then it is no longer a mistake. We don’t agree: we reckon grammar is more like the law. There is a set of agreed rules but the rules change over time. Both are based on a principle: with the Law the principle is fairness; with grammar the principle is clarity. We need a common language in order to be able to communicate.

Grammar – and, like so many pontificators on the topic, by grammar they mean language in general – is not like the law. Language is a set of social customs.

The law is defined by a particular group of people whose job it is to define the law. No such group exists for language, apart from the entire English-speaking population.

The “agreed rules” of English are agreed only tacitly, although we can try to codify them. And, at any one time, many of them cover only part of the population, or only some situations. The agreements change as more and more individuals change their linguistic habits; there is no moment of decision when a change is agreed.

The principle is clarity – and the people best placed to judge clarity are the people. All of us. We are the ones reading, writing, talking, listening. If we find that one way of using words helps us communicate more clearly than another, then we will favour the better way. There would be no point in having some committee of the great and the good to ponder its way towards these decisions on our behalf.

Most people who complain about language as the Idler’s judges do aren’t making a linguistic analysis, based on fact and reason. They’re just venting their conservatism.

I find it sad, and a little puzzling, that conservatives struggle so hard to grasp that language is governed by market forces. Nobody is in charge, and that makes our language far more dynamic, efficient and rich.


Update: John McIntyre develops the ‘language as market’ analogy:

Some products stay on the market for a very long time. Others are periodically rebranded. New products pop up, have a vogue; some of them remain on the shelf, but most fail to find steady customers. There are upmarket products and downmarket products. The result is the product of innumerable individual choices.

Language is thoroughly democratic.

That’s exactly what I was (less articulately) thinking. Except for maybe the last bit.

In a democracy, everyone – in theory – has an equal vote. But in a market, we’re not all equal.

Some people are better than others at manufacturing usages, or at least marketing them. And some have more purchasing power than others. Those whose language is the most prominent – such as famous people or major publishers – have more influence on the language market, and can affect the fortunes of certain usages by buying into them.

Market success doesn’t necessarily reflect the intrinsic quality of a product – remember VHS vs Betamax? And when what consumers value most is compatibility, as with language, the idea of intrinsic quality breaks down altogether.

Sometimes a well-designed product will go bust. The people who backed it may try to keep it alive, but if the buyers lose interest there’s not much to be done. You can’t buck the market.