The confessions of a recovering pedant

I love the English language, and I love being right. So I became a copy editor. But I grew to realise that my concept of rightness was wrong – and that if I truly loved the language, I’d have to change.

The rise and fall of my mental rulebook

I’ve always been this way. At school, I was the kid who got asked to spell difficult words to save my classmates a trip to the dictionary. In my first job, I was the guy who got given reports to proofread before they went to the client. I played these parts with quiet relish.

But I wanted more.

And when I started getting into editorial work, I knew I needed to be more. My instincts needed to be sharper and my arsenal bigger. So I scoured books and websites for grammatical constructions to avoid, semantic distinctions to maintain. I built an orderly system of rules inside my head – a monument to my love for English and a guide to make sure I treated it right.

Use ‘that’ and not ‘which’ to introduce a restrictive clause. Use ‘affinity’ followed by ‘between’ or ‘with’ but never ‘to’ or ‘for’. Use ‘comprise’, without a preposition, to relate the whole to the parts and not vice versa. Use ‘compare to’ only when likening things. Use ‘disinterested’ for impartiality and not unconcern. Use ‘data’ in the plural. And please, use ‘beg the question’ only after consulting a philosopher.

Rules to make the language strong. Rules to keep it safe. Rules to respect logic and tradition.

For a time, I was happy. But slowly, two shadows fell over me.

The first was all the people – educated, intelligent, articulate people – who didn’t know about many of these rules. And yet they could still speak and write clearly, even powerfully and beautifully. So what exactly were those rules for? Who were they for?

The second came from the opposite direction. I’ve never been able to take seriously some so-called ‘rules’: don’t split infinitives; don’t end a sentence with a preposition; don’t start a sentence with a conjunction; don’t use ‘since’ or ‘as’ to mean ‘because’; don’t use ‘between’ with more than two things; use ‘jealous’ to denote protectiveness and not envy. These seemed archaic, arbitrary and irrelevant to how English really works. When someone insisted on one of these ‘rules’, I’d shake my head and wonder how they got themselves into such a state.

I was sorting the real rules from the bogus – using only my own judgement about what made sense and what felt right.

Not good enough.

So all of this led me to the question that unravels everything: what determines the rules of the English language?

Usage and its malcontents

We all know where the rules of English come from. They come from whichever English teacher we had when we realised that the cool kids were never going to like us and so we might as well start paying attention in class. Or they come from whichever dictionary or usage guide we have on our bookshelf when we feel the need for authority.

Of course, these are just intermediaries – but we rarely stop to think about where the rules ultimately come from.

They aren’t divine commandments. They aren’t logically derived truths like those of mathematics. And they aren’t decreed by a committee of wise old scholars in a dusty room in Oxford. Any authority has to draw its authority from somewhere else – and so on and so on.

The only place to stop is at the bottom, with the community that uses the language. Ordinary usage is a real, measurable thing that exists without needing justification. If rules are genuine, it’s hard to see where else they could come from.

Just ask a lexicographer. A dictionary can be a useful guide only by describing the territory as it is. When a new word or meaning gets added, it’s not because the compilers have given us a new creation. It’s because they’ve noticed people using the word that way and they’ve recorded it. The same goes for grammar: people put words together in certain ways, some of which become widely accepted and others don’t. And what’s accepted varies.

Some people don’t like this idea. They think a lot of common language is pretty poor. They fear that admitting rules to be bottom-up would mean that ‘anything goes’. So, while conceding that language comes from the people, they think we should still try to moderate and refine the vagaries of popular chatter.

Bryan Garner says that, while usage is “the overarching criterion for correctness”, it isn’t the only one. He gives five criteria for judging whether a word or phrase is “undesirable” and therefore to be opposed even if it’s widely used:

(a) it sounds newfangled
(b) it defies logic
(c) it threatens to displace an established expression (but hasn’t yet done so)
(d) it originated in a misunderstanding of a word or its etymology
(e) it blurs a useful distinction.

I partly agree. These are fair reasons to dislike a term, although (a) is subjective and (d) may be ancient history. But I don’t agree that a wise, dispassionate expert should apply these to trump the usage of the masses. This is because the criteria are already factored in. People – ordinary people, not just language obsessives – do tend to avoid language that offends on these grounds. So usage reflects these criteria, as applied by the community.

Take ‘irregardless’, probably the most hated word in the English language. It ticks (a), (b), (c) and (d): its double negative is an illogical innovation that misunderstands how prefixes and suffixes work, and it treads on the toes of the perfectly good ‘regardless’. It’s the Egyptian cat goddess of pet peeves.

But people really don’t use it much.

I looked in Brigham Young University’s Corpus of Global Web-Based English, a catalogue of text from 1.8 million randomly selected web pages. Uses of ‘regardless’ outnumber ‘irregardless’ by 93,830 to 534 (176 to 1). And of those 534, more than a fifth are in discussions of the word’s use.
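
(If you like to check the sums, here’s a rough back-of-the-envelope sketch in Python. The two counts are the ones from the corpus search above; the one-fifth share of metalinguistic uses is my assumed lower bound, since the search showed ‘more than a fifth’.)

    # Rough arithmetic behind the GloWbE figures quoted above.
    regardless_hits = 93_830    # occurrences of 'regardless' in the corpus search
    irregardless_hits = 534     # occurrences of 'irregardless'

    ratio = regardless_hits / irregardless_hits
    print(f"'regardless' outnumbers 'irregardless' by roughly {round(ratio)} to 1")  # about 176 to 1

    # 'More than a fifth' of the 534 hits were discussions of the word itself,
    # so take one fifth as a lower bound on the metalinguistic uses.
    meta_uses_lower_bound = irregardless_hits // 5
    genuine_uses_upper_bound = irregardless_hits - meta_uses_lower_bound
    print(f"At most {genuine_uses_upper_bound} genuine uses across 1.8 million web pages")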

Even on the web, the international home of sloppy writing, ‘irregardless’ is rarely used – and when it is, it’s often in quotes, as the object of complaint.

Most ordinary people avoid it for the same reason most language obsessives hate it: it offends on so many of Garner’s counts. It feels wrong. So when an expert applies those criteria to trump usage, all they’re doing is disregarding and overruling the unreflective judgements that people make anyway.

And yet… I like to think my judgement is better. Not because I’m a better person, but because I take the time to consider these things. I worry about the (mostly unappreciated) distinction between ‘historic’ and ‘historical’, and about the (increasingly rare) use of ‘fulsome’ to mean ‘insincerely excessive’. I feel that people like me should be justified in standing up and fighting when we see something useful in language at risk of being lost.

But this is where the idea of usefulness in language breaks down. Useful to whom?

Compatibility and communication

In about 1990, my family sadly threw away our collection of Betamax video cassettes. They were more compact than VHS cassettes and had higher picture quality. But VHS’s marketing and pricing had given it the edge, and once it was ahead, its popularity grew, feeding on itself. People wanted to have the same format as their friends. Shops didn’t want to stock two versions of the same product.

The format war was won by compatibility. The inherent quality of tapes mattered less than how widely they could be used, so the Betamax market shrivelled to nothing. Use defined usefulness.

So too in language. I’m free to use ‘fulsome’ to mean ‘insincerely excessive’ – but if I do, my language won’t be compatible with a lot of other people’s.

I could try to buck the linguistic market, promoting a declining usage by my own example. It’s a fine ambition. But my writing (and even the writing I edit) is a tiny fraction of my audience’s reading. How are they to know that my usage is the one they should embrace, forsaking all others? We can each do our tiny bit, I suppose. But is that worth the risk of distracting them and detracting from the message I’m meant to be communicating?

I’ve sometimes done just that. For a couple of years, this was my out-of-office autoreply:

Thank you for your email. I shall be out of the office until [date], and I will reply to you as soon as possible when I get back.

‘I shall’ for the simple statement of fact, ‘I will’ for the emphatic promise of action. A deft observance of a subtle distinction that has been observed by deft, subtle people like me since John Wallis first decreed it in his 1653 guide to English grammar. (The distinction, for reasons I’ve never understood, goes the other way in the second and third persons, so ‘you shall’ and ‘he/she/it/they shall’ carry more emphasis than ‘you will’ etc.)

But, whether or not this rule was obeyed in centuries past, it’s clearly niche today. So my email’s shall–will switch will have meant nothing to most of my readers. Maybe a few appreciated it, but I can only hope the rest didn’t notice. If they did, it will have seemed odd and maybe a little pompous.

The point of language is to communicate. If you want to succeed, your language needs to be compatible with your audience’s. If they don’t recognise a distinction that your message relies on, you will fail.

A rule that doesn’t reflect usage can get in your way. Yes, English might be better if everyone followed certain extra rules known only to a few of us. And my commute to work might be better if it only took two minutes. But I’d be a fool to leave the house at 8.58.

The unwanted legislators of the world

Rules that don’t reflect usage exist because certain people think they know better.

Take this one: just as you say ‘some of the’ and not ‘some the’ (and likewise for ‘none’, ‘lots’ and ‘most’), you should say ‘all of the’ and not ‘all the’.

It’s pretty typical of its kind: it has a logical-looking rationale, it aims to increase consistency, and it’s unknown to most people. In fact, hardly anyone has heard of this rule. I made it up. I don’t remember how the idea came to me, but when it did it seemed reasonable, so I upheld it (for a while).

I can’t claim I’m the only person ever to think of or apply this rule, but I can’t find anybody else who has; in fact, the pundits who do have a preference favour ‘all the’ on grounds of brevity. And to oppose ‘all the’ in favour of ‘all of the’ is to defy standard usage going back centuries.

Oops.

But it truly didn’t occur to me that the rule was my invention. I didn’t think I was bending grammar to my whim. The rule just struck me as a simple way to promote order and consistency, which seemed self-evidently a good thing.

Luckily, I gave up on it before too long. But if I’d asserted it confidently in front of anyone impressionable, or if I’d put it into our house style guide at work, or if I’d been a notable grammarian of the 18th or 19th century…

This is how these things get started. Three real examples (from Merriam-Webster’s Dictionary of English Usage):

  • Say ‘different from’ and not ‘different to’ or ‘different than’. All three were in common use by the early 1600s. But in 1770, Robert Baker argued that because we say ‘differ from’, for consistency we should also say ‘different from’; the other two were inferior. His logic caught on among other commentators, and their descendants prey on the unwary to this day.
  • Do not use ‘verbal’ to mean ‘spoken’. The OED finds this forbidden usage going back to 1591; it’s recorded in Samuel Johnson’s 1755 dictionary and in Noah Webster’s of 1828. But in 1881, William Ballantyne Hodgson cited etymology: ‘verbal’ is from the Latin ‘verbum’ (‘word’). So, he decided, it should strictly mean ‘in words’ – whether spoken or written. Others took up the cause, but most people are unaware that the cause even exists.
  • Do not end a sentence with a preposition. John Dryden in 1672 criticised the poetry of Ben Jonson for doing this. He didn’t say why it was a fault, but he seems to have taken his cue from the rules of Latin. In Robert Lowth’s popular grammar book of 1762, this fancy became a principle: that terminal prepositions were idiomatic but not fit for formal language. Later books by Hugh Blair, Lindley Murray and others strengthened the condemnation. Despite counter-attacks from 20th-century commentators and obliviousness from ordinary people, the rule clings to a semblance of life.

It isn’t clear why some ‘rules’ catch on and others don’t. Don Chapman surveyed a dozen usage guides and found that, on average, 14% of the entries in each book didn’t appear in any of the others. Different commentators worry about different things. But even the things they agree on don’t necessarily reflect general usage. Chapman suggests that a dislike can catch on among these writers if the disliked usage is common enough to be noticed but not so common that it seems pointless to fight, and if the dislike can be rationalised on grounds such as Garner’s criteria.

Commentators may persuade their peers and successors to take up arms, but unless they’re paying careful and unbiased attention to usage, their consensus may be isolated from the real language.

What’s right and what works

How language is used varies and what’s accepted is a matter of degree – facts that don’t fit well with the concepts of right and wrong. And it’s easy to get tied up in futile arguments about what is and isn’t a rule of grammar or the ‘real’ meaning of a word. Rules are like inflatable armbands: they’re helpful for learning, but when you know what you’re doing they just get in the way.

So let’s forget about correctness and instead judge language by how effective it is. How well does a certain piece of language achieve the desired effect on the intended audience? What sort of language will work best in a given context?

Language can be ineffective for all sorts of reasons: people can find it too stuffy, abstruse, archaic, colloquial, simplistic, faddish… any of these can make a reader lose interest and a writer lose readers. Perceiving ‘incorrectness’ is just another bad reaction, and there’s no practical gain in setting it apart from all the others. Everything useful that gets filed under correctness also comes under effectiveness.

Forgetting about what’s right doesn’t mean that ‘anything goes’; by looking at what works, we can see more clearly that different things go in different places and most things go nowhere.

I’ll illustrate with some examples from Geoffrey Pullum and Rodney Huddleston’s Cambridge Grammar of the English Language:

  1. It is clear whom they had in mind.
  2. It’s clear who they had in mind.
  3. Kim and I saw the accident.
  4. Kim and me saw the accident.
  5. I have not seen them.
  6. I saw not them.

Sentences 1 and 2, they say, are both perfectly standard but differ in register: 1 is more formal than 2. By contrast, 3 follows the grammatical conventions of standard English but 4 doesn’t, although 4 is fine in many other dialects. And the difference between 5 and 6 is straightforwardly grammatical: 6 is not accepted in any dialect, and no adult native English speaker would say it.

I don’t dispute these categories of difference, but there are people who would, insisting that 2 and 4 are simply always ungrammatical. And while Pullum, Huddleston and others ably oppose this view, I don’t need to get involved. I just need to choose which of each pair works better.

1 or 2: People with very conservative, formal tastes may prefer 1 but most others will find 2 more natural. If my audience is a mix, I try to judge what the balance is and whether I want to risk the first group finding me uncouth or the second group finding me pretentious.

3 or 4: A similar trade-off applies, although among the general population, 3 will be seen as the ‘right’ option much more than 1 was over 2. But I need to think about the dialect of my audience – and, at least as importantly, the dialect they expect to hear from me. Anything that sounds fake will sound bad.

5 or 6: Everyone will think 6 is ungrammatical, and that’s all the reason to avoid it. Adding that it is ungrammatical adds nothing.

Likewise for other things, such as spelling and word choice, that are most obviously seen as matters of correctness. If I write ‘neccessary’, most readers will think I’m careless or semi-literate. If I write ‘should of’, some readers won’t bat an eyelid, but many will frown and some will have violent convulsions. The perception of wrongness is what counts. There’s no need to decide whether the perception is right.

True, in these cases the decision is easy, but in others what’s right and wrong is debated – and the debate is a waste of time.

If I’m writing for an audience that hates split infinitives (it’s likely to be a tiny audience), I’ll be tempted to rebel. But that could make my message less effective, so my best bet may be gently to humour them.

A larger general audience may include a few such people. If I placate them, they’ll be more receptive to what I’m saying. But other people can find some unsplit infinitives a bit stilted. This is a weaker reaction than annoyance at ‘bad grammar’, although a more common one. So I may have to choose which bad reaction I prefer. In practice I’ll usually split to avoid stiltedness, but it could be a legitimate tactical choice not to.

I might think a certain ‘rule’ is a ridiculous superstition believed only by bad-tempered oddballs. But if bad-tempered oddballs are in the audience, they count as much as anyone else. They don’t get a veto, but they do get a vote.

Thinking about what works and not what’s right is also a practical way to handle words and phrases whose meanings aren’t consistently understood. If I use ‘biannual’, then unless I contextualise it very carefully, some readers will think I mean twice a year, some will think I mean every second year, and some will wonder which I mean. If I can’t reasonably fit contextual cues in, or if I don’t have the time to try, then for a general audience I probably won’t use it.

If I want to use ‘crescendo’ to mean the build-up rather than the climax, or to use ‘infamous’ to imply immorality rather than controversy, or to hang an important point on the distinction between ‘may have’ and ‘might have’… I’ll stop and think of my audience and how they’re likely to understand these terms.

Satisfying myself of my own correctness gets me nowhere if my aim is to communicate.

How to do it

There’s a practical problem with my approach. I may not know how many of my audience have what opinion about ‘whom’ or ‘disinterested’ or whatever it is. This is easier to handle, though, than the problem of adjudicating what’s correct and on what grounds.

Short of spending an implausible amount of time and money running surveys, a reasonable assumption is that what’s generally used is what’s generally accepted. Given this, there are resources that can help:

  • Dictionaries, which record the meanings people actually give to words.
  • Corpora, such as Brigham Young University’s Corpus of Global Web-Based English, which show how common different usages really are.
  • Evidence-based usage guides, such as Merriam-Webster’s Dictionary of English Usage, which report the history of a dispute as well as the verdict.

It’s also important to read and listen widely, to keep in touch with the varieties of language that different audiences might prefer.

The right way to be right

It can be hard to give up on the idea of correctness. Even though effectiveness is a more useful idea, there’s a pleasure in being right. I’m still vain enough to enjoy thinking of myself as an elite guardian of linguistic wisdom.

If you’re like me and you don’t want to give that up, you don’t have to. You just have to change your strategy, and you can be more powerfully right about language than ever before. The thing to remember about being a know-it-all is that you do need to know it all – or at least more than anyone else in the room.

Imagine a social event of some sort, at which you find yourself talking to Frank and Verity. The three of you are only slightly acquainted.

Frank, talking about a recent career move, says: “I wasn’t sure about the new position, but my manager convinced me to apply for it.”

At this, Verity interjects, with her face sculpted into a look of what is surely sincere helpfulness: “Actually, I think you’ll find that should be ‘persuaded’. Strictly speaking, you convince someone of a fact or that something is a fact, but you persuade them to do something. Mixing the two words up is an increasingly common mistake, but this is a long-established distinction.”

This is what you would once have said. But now you’ve ascended to a higher plane. So, just as Frank is about to humbly thank Verity for the generous lesson, you make your move:

“Actually, I think you’ll find it’s a bit more complicated than that.

“For one thing, there’s the question of whether the distinction is about meaning (to convince is to induce to believe, to persuade is to induce to act) or about syntax (‘convince’ goes with ‘of’ or ‘that’, ‘persuade’ goes with ‘to’). Usually these two distinctions go together, but you can see the lines blurring in phrases like ‘she persuaded me to believe it’ and ‘she convinced me that I should do it’.

“Then there’s the question of whether the distinction historically is symmetrical – and it isn’t. ‘Persuade’ has been used both ways for centuries: its use with ‘that’ and ‘of’ to mean ‘induce to believe’ is older than Great Britain. A few unhappy people muttered about this in the 19th century, but even back then the usage was far too well-established to shift.

“But the (action-related) use of ‘convince to’ is a 20th-century innovation. It was appearing in print by the 1950s – and ever since then, there have been people complaining about it. Intriguingly, some of the complainers have resurrected the earlier mutterings against ‘persuade of/that’, apparently in pursuit of that mythical symmetry. But, despite the complaints, ‘convince to’ has become more and more common.

“Does that make it correct? Well, the dictionaries, taking their cue from the public, mostly accept it: you can find ‘convince’ and ‘persuade’ in each other’s definitions. Merriam-Webster’s usage dictionary says that ‘convince to’ is fully established. Most of the American Heritage Dictionary’s usage panel had come round to it by the mid-1990s. Even Bryan Garner, who personally recommends against it, accepts that he’s in a small minority as its use has become virtually universal.

“So, if most people use ‘convince to’ and most people accept it, it’s mostly going to be fine if you use it. But it’s worth knowing that there are people who don’t like it, and if you use it in front of them, it may detract from what you’re saying. But, Frank, you had no reason to think that Verity or I might be in that minority, so your use of it was perfectly reasonable, even if it did happen to prove a little disruptive. Another drink, anyone?”

A moment of stunned silence. Frank and Verity, who are both definitely still there, are agog.

And then you bask in a glory far greater than that which Verity had sought. Frank gazes in awe at you, sage and saviour; Verity yields to your superior learning with an admiring smile. You shall surely have your pick of their beds tonight.

My life is like this now (in theory). And you too could put yourself in the delicious position of correcting the corrector.

This is the right way to be right about language: find out the facts about usage and disputes, and judge words by their effect on the audience.

I used to be a terrible pedant. But now I’m much better.


Comments

  • Joy  On July 22, 2013 at 10:27 am

    Delightful article, perhaps even Frank & Verity would concur! And despite reacting with outrage to some people’s use of who and whom (there are RULES, folks!!!), to the word “liaise” (it is NOT a verb!!!), and to the endless debate over different from/to/than (it’s clearly related to where you grew up and all are fine, even if I do mentally convert “different than” to “different from” every time I hear it), I totally agree with you!

  • Jonathon Owen  On July 22, 2013 at 5:24 pm

    Great post, Tom. It’s the kind of thing I wish I had written myself. (I actually have something along these lines in the works for Visual Thesaurus, so I’m sure I’ll end up quoting you.)

    • Tom Freeman  On July 22, 2013 at 5:32 pm

      Thanks, Jonathon. You actually indirectly contributed to it, as you’re one of several people whose blogging has helped to shape my thinking over the last few years. I would have done some naming, but I didn’t want it to get all Oscar speechy at the end (or to risk missing anyone out).

  • Jonathon Owen  On July 24, 2013 at 4:30 am

    Oh, and I forgot to mention that Don Chapman was my thesis advisor. I loved working with him. We always had the best discussions about usage and prescriptivism.

  • ProsWrite  On August 13, 2013 at 2:49 pm

    Thanks for this, Tom. You’ve written my basic story of recovery, too. The same two clouds formed over me during my time as an undergraduate student back in the 1980s. Luckily, I enrolled in a linguistics course — without knowing what the consequences would be — and my understanding of language rules changed forever. The world is so much more interesting in recovery!

  • pipmarks  On March 10, 2014 at 10:09 am

    Most editors consider ‘got’ to be a weak word (at best), don’t they? I’m surprised that a self-confessed pedant would choose to use it multiple times in a post. Or is this part of your recovery?

    • Tom Freeman  On March 10, 2014 at 1:01 pm

      There are certainly times when it can be weak, but I’m satisfied with my use of it here. I’m sure there are plenty of things I could have improved if I’d spent (even!) more time revising, but this doesn’t stand out for me.

      • pipmarks  On March 10, 2014 at 1:24 pm

        Sorry, my comment was not meant as a criticism. I was always told no split infinitives (which you debunked), no double negatives (even though I couldn’t resist one in my other comment), and never say ‘got’ (so I just wondered if attitudes had changed). Editing seems to be more relaxed now – which is a good thing. I liked the sensible rules you included re split infinitives.

      • Tom Freeman  On March 10, 2014 at 7:02 pm

        No offence taken! I hadn’t come across a generalised “don’t use ‘got'” before. I guess the people who teach us these things each have their own favourite bugbears that they picked up from wherever, and that they pass down to us as if hard-and-fast rules. There are things it’s taken me a while to exorcise from my brain!

  • Tom Freeman  On May 6, 2014 at 12:33 pm

    I’ve just corrected “We all know where from the rules of English come from” to “We all know where the rules of English come from”. I have no idea how that first “from” got there, and I’m aghast that I missed it, but I’m grateful to barney67 for pointing it out.
