Our survey says…

A YouGov poll showed 1000 Americans some sentences and asked them to pick which one of each pair or group was grammatically correct.

These were the results:

  1. My oak tree loses it’s leaves in autumn. – 31%
    My oak tree loses its leaves in autumn. – 67%
  2. The dogs are happily chewing on they’re bones. – 4%
    The dogs are happily chewing on their bones. – 89%
    The dogs are happily chewing on there bones. – 4%
  3. I think you’re very smart. – 91%
    I think your very smart. – 6%
  4. I consulted an attorney whom I met in New York. – 61%
    I consulted an attorney who I met in New York. – 32%
  5. I don’t trust fruits and vegetables which aren’t organic. – 24%
    I don’t trust fruits and vegetables that aren’t organic. – 68%
  6. The beneficial affects of exercise are well documented. – 17%
    The beneficial effects of exercise are well documented. – 77%

The results are noteworthy, but the poll has two weaknesses.

One would have been painfully easy to fix: adding a “both” option. While 1, 2, 3 and 6 are unarguably testing common mistakes (albeit of punctuation and spelling rather than grammar), 4 and 5 are contested usage issues. Forced to choose one or the other, some people will have leaned towards a vaguely remembered rule that they may not really care about. It would have been good to know how many people think either option is fine.

The second problem is deeper: if you set up a test like this, telling people that there’s only one right answer, their reactions will become unnatural. They will turn their prescriptivism up to maximum and worry about things they would normally happily ignore when reading or in their own usage.

A few remarks on these results:

Far more people got 1 wrong than the similar 2 and 3. This suggests that the lack of an apostrophe in possessive “its” is confusing – aren’t we supposed to use apostrophes to show possession? Pesky illogical English.

From 4, we can see that “whom” isn’t overwhelmingly regarded as necessary for an object. As for the third of people preferring “who” here – I wonder how many were answering as asked, about correctness, and how many were just picking the level of formality they preferred.

The answers to 5 show that the campaign to ban restrictive “which” has made an impression on people (at least in the US). But again I stress that we don’t know whether these answers reflect actual usage and firmly held opinions or just “oh yeah, I think I’ve heard that you should use ‘that’”.

Ask a stupid question

Geoffrey Pullum sifts through the contents of alleged “grammar” quizzes. He is scathing:

if quizzes on chemistry were as uninformed as those on grammar, they would ask silly questions on peripheral topics (“Who is the Bunsen burner named after?”), and would make no reference to the periodic table, or atoms or molecules. The web’s grammar quizzes deal in minor pieces of puristic flotsam with roots in dimly understood 18th-century grammatical analysis.

Why are these quizzes so bad? He rightly blames the incompetence of the setters and their obsession with “fiddly details, gotcha points, and timeworn myths”. But there’s a problem on the other side too: while a proper linguist like Pullum could write a proper test on the features of English grammar, most of the people who took it would struggle. He notes that “few educated people know anything about grammatical analysis”.

We are, most of us, lacking in the theory of grammar. We might vaguely remember a few schoolroom basics, like “a noun is the name of a person, place or thing”. But this is about as useful a definition as “a bird is an animal that flies”. Sure, it identifies the prototypical cases, and it can get a child into the right sort of territory to begin to understand, but the truth is much more complex.

Look at the nouns – “deal”, “unclarity”, “heck”, “crisis”, “absence”, “chance”, “sake” – in this example of Pullum’s (they were underlined in the original):

There’s a great deal of unclarity about what the heck would happen in a crisis during your absence, and there’s every chance of that, for heaven’s sake!

No meaningful definition of “thing” can accommodate these. Instead, you have to define grammatical categories by their grammatical properties. But this knowledge is uncommon.

On the other hand, if you want to test people’s practical grasp of grammar – well, that’s easy. But pretty pointless.

Recoiling from complexity

Nathan Heller has critically reviewed Steven Pinker’s The Sense of Style. His main complaint is that Pinker’s advice includes too much leeway based on context and subjective judgement and too little respect for the “basic principles of English grammar”. For example:

English is complex. To help reduce ambiguity, modern usage attaches specific words to specific functions. The restrictive-nonrestrictive division between “that” and “which”—two particularly common and shifty words—is one attempt at clarity.

Heller cites “usage”. To remind you: usage is what people do with language. So, do people use ‘that’ to open restrictive clauses and ‘which’ to open nonrestrictive ones? Some do, but many don’t.

A great little study by Jonathon Owen, for his Master’s thesis at Brigham Young University, looked at the kinds of changes copy editors make. He reviewed edits made to manuscripts written by academics and found one of the commonest to be imposing the “the that/which rule, proscribing the use of which as a restrictive relative pronoun”. This tells us that this distinction may be widely observed by copy editors but less so by even educated writers of formal prose – and presumably less still by the public as a whole.

The American Heritage Dictionary (AHD) goes further, noting that “this use of which with restrictive clauses is very common, even in edited prose”.

So yes, people have attempted to create this distinction: EB White recommended it, so did HW Fowler, so does Garner’s Modern American Usage. And dedicated copy editors have done their best to make usage follow this distinction (including me, until I wised up). But usage persists in going its own way (Garner concedes that restrictive ‘which’ is ubiquitous). And no lack of clarity has resulted, because we use commas to highlight nonrestrictive clauses.

Heller has further concerns:

Another is the rule that “like” joins noun phrases, while “as” or “as if” is for verb phrases. (“It looks as if my date is here!” “You look like Mom in that dress.”) Pinker doesn’t see the point of that one, either.

His grammar terms aren’t quite right here: “my date is here” is a clause, not a verb phrase; “is here” is the verb phrase.

Again, this is a rule insofar as people like Heller insist on it, but by the standards of usage there’s far more leeway, depending on the situation. The AHD again:

They don’t make them like they used to. I remember it like it was yesterday. As these familiar examples show, like is often used as a conjunction meaning “as” or “as if,” particularly in speech. While writers since Chaucer’s time have used like as a conjunction, the usage today has a somewhat informal or conversational flavor. Language critics and writing handbooks have condemned the conjunctive use of like for more than a century, and in accordance with this tradition, like is usually edited out of more formal prose.

But it adds:

Like is acceptable at all levels as a conjunction when used with verbs such as feel, look, seem, sound, and taste: It looks like we are in for a rough winter.

And Garner accepts that conjunctive ‘like’ is “acceptable casual English”, although not yet “unimpeachable”. Again, genuine confusions caused by using ‘like’ in this way are vanishingly rare. Heller raises another issue:

It’s for grammatical consistency, not beauty or gentilesse, for example, that correct English has us say “It was he” instead of “It was him.” Pinker calls this offense “a schoolteacher rule” that is “a product of the usual three confusions: English with Latin, informal style with incorrect grammar, and syntax with semantics.” … [But] the nominative and accusative cases are the reason that we don’t say gibberish like “Her gave it to he and then sat by we here!” No idea is more basic to English syntax and grammar. In the phrase “It was he,” “it” and “he” are the same thing: they’re both the subject, and thus nominative.

I agree: it definitely isn’t for beauty that anyone says “It was he”. It sounds so prissy I can hardly bear to type it.

And Heller is right about the difference between nominative and accusative, but then he misapplies it because of another grammatical misunderstanding. In “It was he”, “It” and “he” are not both the subject; “he” is the complement of the verb “was”.

You can see this more clearly in the grammatically equivalent “It was they”. Subjects govern verbs, and here, the singular subject “it” gives us the singular form of the verb “was”. Then “they”, being plural, is obviously not the subject. Heller’s mistake is precisely the confusion of syntax with semantics that he quotes Pinker warning him against. The two pronouns have the same referent but not the same grammatical role.

All this nit-picking is cracking good fun, but it’s beside the point. According to usage, “It was he” lives happily enough at the formal end of the spectrum but struggles elsewhere. More from the AHD:

Traditional grammar requires the nominative form of the pronoun in the predicate of the verb be: It is I (not me); That must be they (not them), and so forth. But in less formal contexts the nominative pronoun can sound pretentious and even ridiculous, especially when the verb is contracted, as in It’s we. The traditional rule creates additional problems when the pronoun following be also functions as the object of a verb or preposition in a relative clause, as in It is not them/they that we have in mind, where the plural pronoun serves as both the predicate of is and the object of have.

Garner also agrees that “it is me and it’s me are fully acceptable, especially in informal contexts”. The resultant alleged lack of “grammatical consistency” causes no harm. One final complaint from Heller:

The same is true of “who” and “whom,” another nominative-accusative pair to which Pinker objects, sort of. He writes, “The best advice to writers is to calibrate their use of ‘whom’ to the complexity of the construction and the degree of formality they desire.” Yet who wants to undertake that calibration all the time? The glorious thing about the “who” and “whom” distinction is that it’s simple.

Whom the hell is this guy trying to kid? It’s easy to calibrate formality to context. We all do it all the time.

(And, as Ben Zimmer reports, even writers as good as Nathan Heller trip themselves up on the ‘who’/‘whom’ distinction.)

Garner also likes the nominative/accusative distinction here, but finds that “in certain contexts, whom is stilted”. The AHD agrees: “in practice whom is uncommon in speech and everyday writing because it has a formal tone”.

So, overall, what lies behind Heller’s objections?

This tendency to add complexity, ambiguity, and doubt is a troubling feature of Pinker’s rules.

This is telling. Heller wants language (and guidance on it) to be simple. That’s admirable, but his insistence on unbending rules can lead to a conception of language that is not just simple but simplistic. While he knows in general that language is complex, he seems to recoil from glimpses of this complexity.

Language is alive and its conventions are diverse: as such a vast category of human behaviour, how could it be otherwise? But to accept that is to abandon the certainties of the rulebook, and some people find that troubling. I know I did, at first. Then I found it exhilarating.

While I’m here: I mostly liked Pinker’s book. In the tradition of Joseph Williams’s Style, Pinker doesn’t just set out a list of dos and don’ts – he takes pains to explain what promotes clarity in writing. He covers how to structure a sentence so it’s easier to understand, how to show logical links between sentences and create a sense of flow through a paragraph, how to fight the “curse of knowledge” and focus on the reader’s perspective rather than your own, and a lot more.

These things matter so much. In my copy editing, I daily come across writing that needs these sorts of principles applied. They are far more important than the piddling little worries – like split infinitives and ‘literally’ and ‘very unique’ and double negatives and Heller’s points above – that dominate public debate about language.

On the negative side, I could have done without all Pinker’s sentence diagrams, which I wasted a lot of time trying to understand before realising the details were barely relevant. And the section near the end, where he lists a load of words that he thinks people should stop misusing, goes against the spirit of the rest of the book by not giving reasons. I would also have liked more detail on some of the academic studies he cites to back up his arguments – I don’t have the time or resources to chase up references.

But on the whole, it’s a damn good book.

Guerrilla copyediting Simon Heffer

This has become by far my most popular tweet ever: a page from Simon Heffer’s Strictly English, with my crossings-out all over it.

But I need to say a bit more about it.

First of all, I must confess a stupid mistake: I got the book title wrong. This page is from Heffer’s 2011 book Strictly English. Simply English, which came out this year, is largely an adaptation of the earlier book, and this passage from Strictly does appear in Simply, but split into two entries (‘Fillers’ and ‘Redundant words’). Sorry for the mistake. I must have got confused during my googling.

A good copyeditor pays close attention to detail, and here I failed. Muphry’s law strikes again!

Secondly, among the replies I got were several people who thought the book was actually like that. They thought I was endorsing Heffer’s witty graphical way of making his point rather than showing that he ignores his own advice.

So, to clarify: Heffer’s book looks like this. The crossings-out are mine.

What seems obvious to one person may not be obvious to another. The sarcasm of “Well, he’s convinced me” wasn’t clear to everyone.

Thirdly, the actual cuts.

Some people suggested further edits that could be made, and I agree – this was just a quick-and-dirty job to make one point.

Other people suggested some of my cuts went too far. Again, probably true. There are shades of meaning that I might be blurring – though not, I think, important ones.

And not all my cuts were filler as Heffer defines it: I also cut waffle. But enough of them were filler to show that he, like many other writers of language guides, doesn’t follow his own advice.

Whether that’s a fault in his writing or a fault in his advice – or a bit of both – you can judge for yourself. Personally, I don’t think filler words are “unpardonable” and I don’t think you should cut every word that isn’t needed to preserve meaning and clarity. Redundancies may improve the rhythm or allow flashes of the author’s personality to shine through. Some of his sentences simply read better without the kind of cuts he claims to insist on.

And finally: yes, doing something like this is a bit cheeky. But Heffer is a big boy and sales of either of his books vastly exceed my retweets. He’ll be OK.

Same difference

Oliver Kamm ponders an alleged rule of grammar:

NM Gwynne, the author of the bestselling but absurd Gwynne’s Grammar, says: “Sometimes [prepositions] are important simply because to give the wrong preposition is illiterate, as ‘different to something’ is wrong and ‘different from something’ is correct.”

Why is it illiterate to say different to? Gwynne does not say; nor is it true.

I can answer this. At least, I can report Gwynne’s answer, which he gave at a talk earlier this year.

He said that there’s a danger of ambiguity. If we allowed ‘different to’, then a sentence such as ‘He looks very different to me’ would have two possible meanings: his appearance is unlike mine; or his appearance has, in my opinion, changed.

This is silly, for two reasons.

First of all, because so many words have more than one use, there are lots of indisputably correct standard usages that can – in theory – lead to ambiguity. But context usually makes the intended meaning clear, and if it doesn’t then it’s no great effort to rephrase it.

For example, a sentence such as ‘His garden looks very different from mine’ might have two possible meanings: his garden’s appearance is unlike that of mine; or the appearance of his garden changes if you look at it from the perspective that mine offers.

Secondly, the question of ‘allowing’ this usage is laughable. Language pundits have embarrassingly little influence on general usage. We can recommend, but there really is no question of allowing or forbidding. Nobody is in charge of the language. Nobody ever has been.

The idea that ‘different to’ is wrong was first expressed by Robert Baker in 1770, but he was already too late. It is well- and long-established in British English:

  • William Bonde (1526): His lyght is moche different and vnlyke to the lyght of the holygoost.
  • Henry Fielding (1737): ‘Tis a quite different Thing within to what it is without.
  • Thomas Paine (1811): The king, who, very different to the general class called by that name, is a man of good heart…
  • WM Thackeray (1852): The party of prisoners lived…with comforts very different to those which were awarded to the poor wretches there.

True, it has always been less common than ‘different from’: Google Books puts ‘different from’ well ahead of ‘different to’, by about 14 to 1 at the end of the 20th century – but falling sharply. And uncommon doesn’t mean wrong.

More recently, in the UK part of the GloWbE corpus (recording language used on web pages in 2012), ‘different from’ is only modestly ahead of ‘different to’, by about 10,000 to 7,000. What’s more, of the first hundred entries for ‘different to’, not a single one uses it in the second way that Gwynne suggests. His fear of ambiguity seems overblown.
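To see how much the gap has narrowed, here is a back-of-the-envelope comparison of the two ratios, using only the rounded figures quoted above (the counts are approximations, not exact corpus frequencies):

```python
# Rough comparison of 'different from' vs 'different to',
# using the rounded counts quoted in the text.

# Google Books, end of the 20th century: roughly 14 to 1.
google_books_ratio = 14 / 1

# GloWbE corpus, UK web pages, 2012: about 10,000 vs 7,000.
glowbe_from, glowbe_to = 10_000, 7_000
glowbe_ratio = glowbe_from / glowbe_to

print(f"Google Books: {google_books_ratio:.1f} to 1")   # 14.0 to 1
print(f"GloWbE (UK):  {glowbe_ratio:.1f} to 1")         # 1.4 to 1

# Share of 'different to' in the GloWbE figures:
share_to = glowbe_to / (glowbe_from + glowbe_to)
print(f"'different to' share: {share_to:.0%}")          # 41%
```

So on these rough numbers, ‘different to’ has gone from a 1-in-15 variant to making up about two-fifths of UK web usage.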

Still, some people don’t like it, and that’s worth remembering. Attitudes to usage – like usage itself – need not have a rational basis, but they’re facts all the same.

Despite its critics, though, ‘different to’ is irreversibly a part of the language. That ship has sailed, crossed the ocean, reached its destination, been dismantled, and had its parts used to build houses in which generations of people have lived happy and productive lives.

Update: catteau in the comments reminds me of a good post on this by Stan Carey.

For writers

An editorial response to Hamilton Nolan’s ‘Against editors’.

Nolan has some interesting points to make and some good language to make them with. But the article has a few weaknesses that limit and distort its impact.

Here’s a quick paragraph-by-paragraph summary:

  1. In publishing, writers are unjustly subordinate to editors.
  2. But don’t writers need editors?
  3. Well, OK.
  4. But writers who want career advancement have to become editors.
  5. Writing and editing are different skills, though, so the result is a loss of good writers and an accumulation of bad editors.
  6. Passing a story through a series of editors is wasteful and doesn’t help.
  7. Editing is subjective, and such a series could in theory go on indefinitely.
  8. These unnecessary editors feel the need to edit to justify their existence.
  9. Sure, some editing is necessary, but it’s not as important or as hard as writing – and it’s better-paid.
  10. Let’s keep the good editors but get rid of the needless, bad ones and instead allow writers to get raises and promotions as writers.
  11. Stories are what matter, and stories come from writers, so writers should matter more.

The first problem is near the start. Para 2 includes this imagined response to his opening shot against editors:

You’re one to talk. Your grammar is wronged, your metaphors are blunt bricks, and your similes are like a hot needle to the eyeball. Your infinitives are split, your participles are dangling, your spelling is eroneous, your cliches are old as time, your sentences are repetitive, and your sentences are repetitive.

You see what he’s doing: “old as time” is itself a cliché, and so on through the list. But in fact, “old as time” is a short phrase that readers will harmlessly process in barely a second. The real cliché here is the whole paragraph.

An article about editing that makes a string of deliberate blunders in a list of those very blunders? Really? Come on. That technique stopped being cute around the same time Happy Days did. Readers of this article are likely to have read other articles about editing, and so to have seen this trick before.

The other downside of this is that it slows the piece down, just when it needs to pull readers in. After stating his case in only the vaguest of outlines, Nolan is already anticipating objections and hedging with caveats. Para 2 (and by extension para 3) kills his momentum. The caveats could wait until paras 9 and 10.

A second problem is that “editor” covers several very different jobs, and Nolan doesn’t distinguish. Titles and processes vary between publishers, but a crude typology might be:

  • managing editors, who hire and fire, set the overall policy of the publication, and do all sorts of businessy things that have nothing to do with the articles themselves
  • commissioning editors, who decide what articles should be written and who should write them
  • developmental editors, who deal with tone, focus and structure; they may make changes themselves or feed back to the writer
  • copyeditors, who look at clarity, accuracy, consistency, house style, grammar, usage, typos and suchlike.

These roles often overlap: managing editors may commission and may look over articles before publication; commissioning editors may do the developmental work; copyeditors may have licence to veer into developmental territory. But the editors who do the most wrestling with the words are usually not the ones with the money, power and status.

The higher-level editors may well be writers who have climbed the pole. But copyeditors generally didn’t get where they are by working their way up – up! – from being successful writers. So the career hierarchy Nolan sketches and the series of editors he confronts don’t match up.

By not being clearer about what he means, he risks misleading, confusing or annoying readers (depending on how much they already know).

A third weakness comes in para 10:

It is absurd that most writers must choose between a career spent writing and a career that offers raises and promotions.

This point – that writers should be able to make career progress as writers – is crucial. It’s the one thing he really wants. But he gives no detail. He must have been thinking about this for a long time; he must have ideas. Surely he can’t just mean getting paid a bit more as you get a bit better, year after year. What would these senior writing jobs be? I’m intrigued. But after this, all we get in para 11 is a peroration whose rhetorical force depends on blurring the chronology and importance senses of “first”:

Stories are, ultimately, what matter. Stories are what websites and magazines and media “brands” live and die on. Stories come from writers. Writers come first. They shouldn’t be second-class citizens in their own industry.

Nolan could have trimmed that and got rid of the earlier clichéd caveat, freeing up space to elaborate and clarify where it would help. The article would have had more focus, force and depth as a result.

He has things to say (never mind whether I agree) and a talent for saying them, but the extra perspective of an editor could have helped him raise his voice and speak more powerfully to his readers.

Because that’s what editors are really for: we’re for the readers. And we’re for the writers.

Doctor, doctor…

Simon Rich’s ‘guy walks into a bar’ joke is really well done. Because I have little originality and even less shame, I’ve stolen – sorry, adapted – his approach to make my own.

Write for a single reader

People sometimes say you should write the way you talk. I see what they’re getting at – be more direct and flowing, less stuffy and formal – but you shouldn’t take that advice too literally.

Writing and speech work very differently. If you’ve ever transcribed a conversation, you’ll know that a lot of the time people don’t even talk in sentences. And anyway, some of us are more fluent in writing than in speech.

Maybe a better version of that tip would be to write in a way that would sound natural if you read it out loud, or to write the way you would talk if you were an Aaron Sorkin character.

But a different way of making the same point struck me recently.

I was doing a piece of editing, and one passage in the text didn’t make sense. It was ambiguously phrased, and I didn’t know enough about the subject to figure out the intended meaning. So I emailed the writer and asked him to help.

His reply consisted of two paragraphs.

The first began “I was trying to make the point that…” and then gave me a perfectly lucid, direct explanation of it. The second began “So I would suggest rewriting it as follows…” and then presented a rewrite that, while clearer than the original, was a good bit stiffer and more overwrought than the explanation he’d just given me.

What had happened was very simple. First he had answered my question, telling me – one person to another – what he wanted me to understand. And then he had gone into Writing Mode.

In Writing Mode, you put aside your ordinary, natural fluency for fear that it isn’t up to the occasion, and you reach for ornate words and sentence structures to self-consciously craft a declaration. But these efforts often just get in the way of communicating.

So my advice is to write for an audience of one single person. Don’t let yourself be daunted by the sense that you’re addressing a crowd.

Sure, you may be aiming for a large readership, but each of them will read as an individual. So write for an individual. Imagine one of them in particular – someone who may well not have the same knowledge and priorities as you – and write directly for that one person.

Try it. Yes, I mean you.

Blog comments policy

  1. Don’t be a dick.
  2. If you’re being a dick, I’ll delete your comment, unless maybe you’re being a really funny dick.
  3. I am the final arbiter of dickery.
  4. I will try not to be a dick myself in deciding whether you’re being a dick. No promises, though.
  5. Get over it. It’s just some guy’s blog, and he’s like kind of a dick. Anyway, do you have any idea how few people read this thing?

Passive aggression

As a guide to good writing, Kellye Crane ranks alongside George Orwell and Stephen King. By which I mean they all make the same mistake.

But before I get onto them, I want to mention William Safire.

In 1979, Safire wrote a list of ‘fumblerules of grammar’ – rules that break themselves. You can get a flavour from the first three:

Remember to never split an infinitive.

A preposition is something never to end a sentence with.

The passive voice should never be used.

And so on.

But the passive-voice fumblerule is real. Stephen King, in his 2000 book On Writing, said: “Adverbs, like the passive voice, seem to have been created with the timid writer in mind.”

“Have been created”? Passive alert! But does this really make King seem timid? I don’t think so.

