Physician, explain thyself: science English vs lay English

Doctors and scientists may be brilliant in their fields, but some are not great at communicating their knowledge to the rest of us. This is obviously important when doctors are explaining things to their patients, as it is when research scientists are seeking to engage the public with their work. To succeed, the experts need to tailor their language to a lay audience.

There’s a lot of good advice you can get on how to tell engaging stories about science, and on how to apply principles of clarity and coherence to scientific prose. But, like Vincent Vega in Pulp Fiction, I’m most struck by the little differences between lay English and the English of biomedical science.

I’m going to talk you through a few of the things that I notice again and again as a copyeditor. Biomedical science is my patch, but some of these points will apply to any expert who wants to address an audience of outsiders.

Jargon

Specialist terms are great among specialists, but they can make an outsider’s brain shut down. Everyone knows this, but specialists often underestimate how niche their own jargon is. If you and all your colleagues use a word every day for years, it becomes familiar and obvious. But that doesn’t mean knowledge of it has spread more widely.

For example:

  • A study in the American Journal of Emergency Medicine found that 79% of adult patients in a US hospital emergency department didn’t know that “hemorrhage” was bleeding; 38% didn’t know that “sutures” were stitches; 78% didn’t know that a “fractured” bone is a broken bone (sorry to say, nor did I!).
  • In the UK, the Wellcome Trust Monitor survey of the public found that 48% of adults felt they had a good understanding of the term “DNA” and another 37% some understanding. But only 12% felt they had a good understanding of “human genome” and 14% some understanding; 19% had heard of it but didn’t understand it and 52% had never heard of it.

Steven Pinker writes insightfully about this “curse of knowledge” and how to overcome it. The best trick is to show your writing to an outsider and see what puzzles them. Failing that, you should try to distance yourself from your writing when you review it, and err on the side of caution.

Unless part of your aim is to educate people about specialist terms, avoid them where you can by substituting general descriptions (“parts of a cell with specialised functions” instead of “organelles”) or common names, if they exist (“white blood cell” instead of “leukocyte”). Where you need to use them, introduce them with short, straightforward definitions – and then use them consistently.

The more terms you introduce, the more the reader will struggle – even if you explain each one clearly, newly learned words are harder to keep in mind than old, familiar ones.

I do not think it means what you think it means

Jargon can bewilder people, but they do at least know it when they see it. Potentially more confusing is when everyday words, used in well-known ways by most of us, are used in subtly or completely different ways by scientists (whether across the board or just in certain fields).

For example, in the context of communicating climate science, a paper in Physics Today lists several such words, including:

  • Scientists use “theory” to mean an explanatory framework, but the public use it to mean a hunch or speculation.
  • Scientists use “uncertainty” to mean the range of an estimate, but the public use it to mean ignorance.
  • Scientists use “positive trend” to mean an increase and “positive feedback” to mean a vicious circle, but the public use “positive” to mean good.

In the same vein, a report by the Royal College of General Practitioners cautions that while doctors use “chronic” to mean persistent, patients may think it means severe.

And a few that I’ve come across recently:

  • Scientists may use “determine” to mean influence, but it’s commonly understood to imply more complete control.
  • Scientists use “significant” to mean they’re confident an observation isn’t just chance, but it’s commonly understood to mean important or telling.
  • Scientists may use “fraction” to mean a part, but it commonly has a connotation of being a small part.
  • Scientists and doctors may use “insult” to mean biological damage or injury, but, as the common phrase suggests, insult and injury are widely understood to be different.
  • Doctors often use “trauma” to mean physical damage, but it’s commonly used psychologically.
  • Scientists use “factor” in several specific ways, but to most people it’s vaguer, normally meaning an aspect of a situation that contributes to something happening.
  • Biologists often use “fate” to describe the result of a cell’s development. I like the poetry of this, but for most of us, the word has strong connotations of inevitability and death.
  • Biologists use “analogous” to mean having the same function but a different evolutionary origin; ordinarily, it just means similar to.
  • Scientists may use “predict” to mean give information about or indicate, without the predicted thing necessarily being in the future – the word’s normal meaning. This might make sense from the perspective of a scientist who has yet to examine the predicted thing, but that can confuse lay readers.
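One of the points above, the statistical sense of “significant”, is easy to illustrate with a toy calculation (the numbers here are invented for illustration): with a big enough sample, even a trivially small difference comes out as statistically “significant”, which is exactly why the word misleads lay readers who hear “important”.

```python
import math

# Hypothetical example: two groups with a tiny difference in average score
# (100.0 vs 100.2 on a scale where the standard deviation is 10),
# measured in a very large sample of a million people per group.
mean_a, mean_b, sd, n = 100.0, 100.2, 10.0, 1_000_000

# Standard two-sample z-test computed from the summary statistics.
se = math.sqrt(sd**2 / n + sd**2 / n)     # standard error of the difference
z = (mean_b - mean_a) / se                # test statistic
p = math.erfc(abs(z) / math.sqrt(2))      # two-sided p-value

print(f"z = {z:.1f}, p = {p:.2e}")
# The p-value is vanishingly small, so a scientist would call the result
# highly "significant" -- yet a 0.2-point shift on a scale whose spread
# is 10 points is unlikely to matter to anyone.
```

In other words, statistical significance measures confidence that an effect isn’t chance, not that the effect is big enough to care about.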

Speaking of perspective…

Points of view

Scientists writing about their research in journals rightly take a scientist’s perspective. And science writers who want to tell a story about a scientist’s work can sensibly do the same, although in a different way.

But when the focus is on the subject matter of the science rather than the process of doing it, and when the audience is general, some verbal habits can carry over from journals in little ways – turns of phrase that make prose a bit harder for a lay reader to get into.

For instance, people with a medical condition might be called “patients” or even “cases”.

Being a patient is a limited role rather than an identity. For doctors, who see us in their professional capacity, we definitely are their patients. And we accept that. But after we go home or back to work, that role recedes. To be a patient is typically to be receiving treatment; just having a condition doesn’t quite cut it.

And “cases” is reasonable from an epidemiological point of view, but maybe not if those cases are your readers.

Then there are a few words that shift the perspective from the subject matter to the observations of the people working on it. Writing about “a child showing symptoms of flu” puts the reader in the shoes of the doctor to whom these symptoms were shown. Likewise “presenting” and “displaying”. If you’re writing about what the doctor saw and did, that’s fine. If not, try just “a child with symptoms of flu”.

In with the in crowd

Prepositions are slippery little suckers. These are the words that show relationships between things: on, in, near, under, about, before, during, of, from, to, by…

There are only a couple of hundred prepositions in English, which means they each tend to accumulate a lot of uses. The OED lists 63 definitions of “on”, not counting the archaic or obsolete ones. Which uses go with which prepositions is often arbitrary, and people speaking a second language can easily slip up (French “sur” doesn’t neatly match those 63 English “on”s).

I’ve noticed that in biomedical English, some of the preposition use is different from that in lay English. The biggest difference is with “in”.

I come across descriptions of treatments working in (not for) patients, studies in (not of or on) mosquitoes, drugs used in (not on or for or against) hypertension, bacteria that are deadly in (not to) mice…

Why is this? I think it may be perspective again. If you’re a scientist examining something (a patient, a mosquito, hypertension, a mouse), it may be natural to think of that thing as a container – figuratively or literally – of observable phenomena. So things happen in it. Maybe.

But to laypeople, this sort of language is unusual. It’s a small but distinct sign that a piece of writing isn’t meant for us. And if it is meant for us, that sign is a mistake.

In a similar vein, there are some differences in how articles are used.

Doctors and medical scientists often write things like “prospects for recovery after stroke” while most lay people would say “prospects for recovery after a stroke”. Likewise, compare “damage to visual cortex” and “damage to the visual cortex” (assuming that “visual cortex” isn’t too jargony).

So…

I hope some of this might be helpful. If you have any thoughts on things I could add or change, please let me know in the comments.

Update: I want to stress that while I’ve focused on science and medicine, the problems of insiders talking to outsiders can pop up anywhere. The social sciences, humanities and arts have their own ways of using language. So do people in pretty much any line of work: rocket scientists and brain surgeons, managers and politicians, plumbers and hairdressers.

Has a mechanic ever told you what was wrong with your car while you nodded along dumbly and reached for your wallet?

For any organisation you might work in, you and your colleagues will know more about it, and probably talk about it differently, than people who don’t (or new starters).

Even socially, if you tell someone an anecdote about a person or situation they don’t know, you may need to fill in some of the background for them to understand it properly.

And I suppose even language bloggers might use too much editing or linguistics jargon from time to time. Sorry about that…

Bad reactions to usage: it cuts both ways

John Rentoul and Oliver Kamm are having a disagreement of some sort about language.

Oliver says that “for Chris and I” is grammatically fine and need not be changed to “for Chris and me” for correctness’ sake:

Pedants take it as a given that the grammatical rule for assigning case is the same for a coordinated (or conjoined) pronoun as it is for a non-coordinated one. Yet that doesn’t follow.

He takes a similar position on many of the prescriptive “rules” of English.

John agrees that there’s nothing strictly ungrammatical about this sort of thing. But he thinks that it’s advisable to avoid it anyway:

My defence of pedantry is twofold. The negative argument is that the writer or speaker should do as little as possible to distract the reader or listener from what is being said. And the positive argument is that it is worth observing the conventions of “correct” spelling and grammar, even if we know that they are arbitrary, because they are markers of quality.

Oliver replies:

I am eager, even zealous, that conventions of Standard English be taught well.

But Standard English is a dialect, not “correct English”. Good writing depends on register, not pedants’ ‘rules’.

They do seem to be disagreeing, but it’s hard to pinpoint exactly what about. Maybe just emphasis?

Both use standard English in their own writing, and both recommend knowledge of standard English to others. Both put “correct” in scare-quotes in this context.

I think John worries that if Oliver goes around insisting that non-standard conventions aren’t wrong, people will use them in standard writing. And I think Oliver worries that if John goes around warning against non-standard usages, people will think non-standard is wrong.

These could both be reasonable worries; a lot of usage debate is painfully simplistic, of the “X is right and Y is wrong” variety. Nuanced positions, in language as in politics, tend to get ignored or caricatured.

I agree with Oliver that people shouldn’t shrink their own linguistic repertoires (and those of others) out of a fear of a mythical rulebook. And I agree with John that people need to have good standard English when it’s the right tool for the job – as it often is. Which of these two points is more important will depend on the situation.

Anyway, they hardly need me to adjudicate.

My response to John’s twofold “defence of pedantry” is that his positive and negative arguments both boil down to this: write in the way that will make your message come across best to your readers.

I agree. But it cuts both ways.

Sure, some people are annoyed by usages that they regard as wrong. And whether they’re right that these are wrong is beside the point: if they react badly, you lose them as readers. However, a rigid insistence on certain rules (or conventions) can make sentences sound bad to other people. A studiously unsplit infinitive can come across as awkward; zealously “whom”ing can seem stuffy. This is a different kind of bad reaction but it can still cost you readers.

It depends on the audience.

You can give yourself a headache trying to work out exactly what’s a matter of correct vs incorrect, what’s formal vs informal, what’s standard vs non-standard, what’s a matter of stylistic taste, and what’s one of the “rules” that finds support from some small-to-medium-sized proportion of standard English speakers.

But if you’re a writer or editor, you don’t really need to: you just need to judge whether the punters will swallow it.

(I’ve written at more length on this before.)

Our survey says…

A YouGov poll showed 1000 Americans some sentences and asked them to pick which one of each pair or group was grammatically correct.

These were the results:

  1. My oak tree loses it’s leaves in autumn. – 31%
    My oak tree loses its leaves in autumn. – 67%
  2. The dogs are happily chewing on they’re bones. – 4%
    The dogs are happily chewing on their bones. – 89%
    The dogs are happily chewing on there bones. – 4%
  3. I think you’re very smart. – 91%
    I think your very smart. – 6%
  4. I consulted an attorney whom I met in New York. – 61%
    I consulted an attorney who I met in New York. – 32%
  5. I don’t trust fruits and vegetables which aren’t organic. – 24%
    I don’t trust fruits and vegetables that aren’t organic. – 68%
  6. The beneficial affects of exercise are well documented. – 17%
    The beneficial effects of exercise are well documented. – 77%

The results are noteworthy, but the poll has two weaknesses.

One would have been painfully easy to fix: adding a “both” option. While 1, 2, 3 and 6 are unarguably testing common mistakes (albeit of punctuation and spelling rather than grammar), 4 and 5 are contested usage issues. Forced to choose one or the other, some people will have leaned towards a vaguely remembered rule that they may not really care about. It would have been good to know how many people think either option is fine.

The second problem is deeper: if you set up a test like this, telling people that there’s only one right answer, their reactions will become unnatural. They will turn their prescriptivism up to maximum and worry about things they would normally happily ignore when reading or in their own usage.

A few remarks on these results:

Far more people got 1 wrong than the similar 2 and 3. This suggests that the lack of an apostrophe in possessive “its” is confusing – aren’t we supposed to use apostrophes to show possession? Pesky illogical English.

From 4, we can see that “whom” isn’t overwhelmingly regarded as necessary for an object. As for the third of people preferring “who” here – I wonder how many were answering as asked, about correctness, and how many were just picking the level of formality they preferred.

The answers to 5 show that the campaign to ban restrictive “which” has made an impression on people (at least in the US). But again I stress that we don’t know whether these answers reflect actual usage and firmly held opinions or just “oh yeah, I think I’ve heard that you should use ‘that’”.

Ask a stupid question

Geoffrey Pullum sifts through the contents of alleged “grammar” quizzes. He is scathing:

if quizzes on chemistry were as uninformed as those on grammar, they would ask silly questions on peripheral topics (“Who is the Bunsen burner named after?”), and would make no reference to the periodic table, or atoms or molecules. The web’s grammar quizzes deal in minor pieces of puristic flotsam with roots in dimly understood 18th-century grammatical analysis.

Why are these quizzes so bad? He rightly blames the incompetence of the setters and their obsession with “fiddly details, gotcha points, and timeworn myths”. But there’s a problem on the other side too: while a proper linguist like Pullum could write a proper test on the features of English grammar, most of the people who took it would struggle. He notes that “few educated people know anything about grammatical analysis”.

We are, most of us, lacking in the theory of grammar. We might vaguely remember a few schoolroom basics, like “a noun is the name of a person, place or thing”. But this is about as useful a definition as “a bird is an animal that flies”. Sure, it identifies the prototypical cases, and it can get a child into the right sort of territory to begin to understand, but the truth is much more complex.

Look at the underlined nouns in this example of Pullum’s:

There’s a great deal of unclarity about what the heck would happen in a crisis during your absence, and there’s every chance of that, for heaven’s sake!

No meaningful definition of “thing” can accommodate these. Instead, you have to define grammatical categories by their grammatical properties. But this knowledge is uncommon.

On the other hand, if you want to test people’s practical grasp of grammar – well, that’s easy. But pretty pointless.

Recoiling from complexity

Nathan Heller has critically reviewed Steven Pinker’s The Sense of Style. His main complaint is that Pinker’s advice includes too much leeway based on context and subjective judgement and too little respect for the “basic principles of English grammar”. For example:

English is complex. To help reduce ambiguity, modern usage attaches specific words to specific functions. The restrictive-nonrestrictive division between “that” and “which”—two particularly common and shifty words—is one attempt at clarity.

Heller cites “usage”. To remind you: usage is what people do with language. So, do people use ‘that’ to open restrictive clauses and ‘which’ to open nonrestrictive ones? Some do, but many don’t.

A great little study by Jonathon Owen, for his Master’s thesis at Brigham Young University, looked at the kinds of changes copy editors make. He reviewed edits made to manuscripts written by academics and found one of the commonest to be imposing the “the that/which rule, proscribing the use of which as a restrictive relative pronoun”. This tells us that this distinction may be widely observed by copy editors but less so by even educated writers of formal prose – and presumably less still by the public as a whole.

The American Heritage Dictionary (AHD) goes further, noting that “this use of which with restrictive clauses is very common, even in edited prose”.

So yes, people have attempted to create this distinction: EB White recommended it, so did HW Fowler, so does Garner’s Modern American Usage. And dedicated copy editors have done their best to make usage follow this distinction (including me, until I wised up). But usage persists in going its own way (Garner concedes that restrictive ‘which’ is ubiquitous). And no lack of clarity has resulted, because we use commas to highlight nonrestrictive clauses.

Heller has further concerns:

Another is the rule that “like” joins noun phrases, while “as” or “as if” is for verb phrases. (“It looks as if my date is here!” “You look like Mom in that dress.”) Pinker doesn’t see the point of that one, either.

His grammar terms aren’t quite right here: “my date is here” is a clause, not a verb phrase; “is here” is the verb phrase.

Again, this is a rule insofar as people like Heller insist on it, but by the standards of usage there’s far more leeway, depending on the situation. The AHD again:

They don’t make them like they used to. I remember it like it was yesterday. As these familiar examples show, like is often used as a conjunction meaning “as” or “as if,” particularly in speech. While writers since Chaucer’s time have used like as a conjunction, the usage today has a somewhat informal or conversational flavor. Language critics and writing handbooks have condemned the conjunctive use of like for more than a century, and in accordance with this tradition, like is usually edited out of more formal prose.

But it adds:

Like is acceptable at all levels as a conjunction when used with verbs such as feel, look, seem, sound, and taste: It looks like we are in for a rough winter.

And Garner accepts that conjunctive ‘like’ is “acceptable casual English”, although not yet “unimpeachable”. Again, genuine confusions caused by using ‘like’ in this way are vanishingly rare. Heller raises another issue:

It’s for grammatical consistency, not beauty or gentilesse, for example, that correct English has us say “It was he” instead of “It was him.” Pinker calls this offense “a schoolteacher rule” that is “a product of the usual three confusions: English with Latin, informal style with incorrect grammar, and syntax with semantics.” … [But] the nominative and accusative cases are the reason that we don’t say gibberish like “Her gave it to he and then sat by we here!” No idea is more basic to English syntax and grammar. In the phrase “It was he,” “it” and “he” are the same thing: they’re both the subject, and thus nominative.

I agree: it definitely isn’t for beauty that anyone says “It was he”. It sounds so prissy I can hardly bear to type it.

And Heller is right about the difference between nominative and accusative, but then he misapplies it because of another grammatical misunderstanding. In “It was he”, “It” and “he” are not both the subject; “he” is the complement of the verb, not a second subject.

You can see this more clearly in the grammatically equivalent “It was they”. Subjects govern verbs, and here, the singular subject “it” gives us the singular form of the verb “was”. Then “they”, being plural, is obviously not the subject. Heller’s mistake is precisely the confusion of syntax with semantics that he quotes Pinker warning him against. The two pronouns have the same referent but not the same grammatical role.

All this nit-picking is cracking good fun, but it’s beside the point. According to usage, “It was he” lives happily enough at the formal end of the spectrum but struggles elsewhere. More from the AHD:

Traditional grammar requires the nominative form of the pronoun in the predicate of the verb be: It is I (not me); That must be they (not them), and so forth. But in less formal contexts the nominative pronoun can sound pretentious and even ridiculous, especially when the verb is contracted, as in It’s we. The traditional rule creates additional problems when the pronoun following be also functions as the object of a verb or preposition in a relative clause, as in It is not them/they that we have in mind, where the plural pronoun serves as both the predicate of is and the object of have.

Garner also agrees that “it is me and it’s me are fully acceptable, especially in informal contexts”. The resultant alleged lack of “grammatical consistency” causes no harm. One final complaint from Heller:

The same is true of “who” and “whom,” another nominative-accusative pair to which Pinker objects, sort of. He writes, “The best advice to writers is to calibrate their use of ‘whom’ to the complexity of the construction and the degree of formality they desire.” Yet who wants to undertake that calibration all the time? The glorious thing about the “who” and “whom” distinction is that it’s simple.

Whom the hell is this guy trying to kid? It’s easy to calibrate formality to context. We all do it all the time.

(And, as Ben Zimmer reports, even writers as good as Nathan Heller trip themselves up on the ‘who’/‘whom’ distinction.)

Garner also likes the nominative/accusative distinction here, but finds that “in certain contexts, whom is stilted”. The AHD agrees: “in practice whom is uncommon in speech and everyday writing because it has a formal tone”.

So, overall, what lies behind Heller’s objections?

This tendency to add complexity, ambiguity, and doubt is a troubling feature of Pinker’s rules.

This is telling. Heller wants language (and guidance on it) to be simple. That’s admirable, but his insistence on unbending rules can lead to a conception of language that is not just simple but simplistic. While he knows in general that language is complex, he seems to recoil from glimpses of this complexity.

Language is alive and its conventions are diverse: as such a vast category of human behaviour, how could it be otherwise? But to accept that is to abandon the certainties of the rulebook, and some people find that troubling. I know I did, at first. Then I found it exhilarating.

While I’m here: I mostly liked Pinker’s book. In the tradition of Joseph Williams’s Style, Pinker doesn’t just set out a list of dos and don’ts – he takes pains to explain what promotes clarity in writing. He covers how to structure a sentence so it’s easier to understand, how to show logical links between sentences and create a sense of flow through a paragraph, how to fight the “curse of knowledge” and focus on the reader’s perspective rather than your own, and a lot more.

These things matter so much. In my copy editing, I daily come across writing that needs these sorts of principles applied. They are far more important than the piddling little worries – like split infinitives and ‘literally’ and ‘very unique’ and double negatives and Heller’s points above – that dominate public debate about language.

On the negative side, I could have done without all Pinker’s sentence diagrams, which I wasted a lot of time trying to understand before realising the details were barely relevant. And the section near the end, where he lists a load of words that he thinks people should stop misusing, goes against the spirit of the rest of the book by not giving reasons. I would also have liked more detail on some of the academic studies he cites to back up his arguments – I don’t have the time or resources to chase up references.

But on the whole, it’s a damn good book.

Guerrilla copyediting Simon Heffer

This has become by far my most popular tweet ever:

But I need to say a bit more about it.

First of all, I must confess a stupid mistake: I got the book title wrong. This page is from Heffer’s 2011 book Strictly English. Simply English, which came out this year, is largely an adaptation of the earlier book, and this passage from Strictly does appear in Simply, but split into two entries (‘Fillers’ and ‘Redundant words’). Sorry for the mistake. I must have got confused during my googling.

A good copyeditor pays close attention to detail, and here I failed. Muphry’s law strikes again!

Secondly, among the replies I got were several people who thought the book was actually like that. They thought I was endorsing Heffer’s witty graphical way of making his point rather than showing that he ignores his own advice.

So, to clarify: Heffer’s book looks like this. The crossings-out are mine.

What seems obvious to one person may not be obvious to another. The sarcasm of “Well, he’s convinced me” wasn’t clear to everyone.

Thirdly, the actual cuts.

Some people suggested further edits that could be made, and I agree – this was just a quick-and-dirty job to make one point.

Other people suggested some of my cuts went too far. Again, probably true. There are shades of meaning that I might be blurring – though not, I think, important ones.

And not all my cuts were filler as Heffer defines it: I also cut waffle. But enough of them were filler to show that he, like many other writers of language guides, doesn’t follow his own advice.

Whether that’s a fault in his writing or a fault in his advice – or a bit of both – you can judge for yourself. Personally, I don’t think filler words are “unpardonable” and I don’t think you should cut every word that isn’t needed to preserve meaning and clarity. Redundancies may improve the rhythm or allow flashes of the author’s personality to shine through. Some of his sentences simply read better without the kind of cuts he claims to insist on.

And finally: yes, doing something like this is a bit cheeky. But Heffer is a big boy and sales of either of his books vastly exceed my retweets. He’ll be OK.

Same difference

Oliver Kamm ponders an alleged rule of grammar:

NM Gwynne, the author of the bestselling but absurd Gwynne’s Grammar, says: “Sometimes [prepositions] are important simply because to give the wrong preposition is illiterate, as ‘different to something’ is wrong and ‘different from something’ is correct.”

Why is it illiterate to say different to? Gwynne does not say; nor is it true.

I can answer this. At least, I can report Gwynne’s answer, which he gave at a talk earlier this year.

He said that there’s a danger of ambiguity. If we allowed ‘different to’, then a sentence such as ‘He looks very different to me’ would have two possible meanings: his appearance is unlike mine; or his appearance has, in my opinion, changed.

This is silly, for two reasons.

First of all, because so many words have more than one use, there are lots of indisputably correct standard usages that can – in theory – lead to ambiguity. But context usually makes the intended meaning clear, and if it doesn’t then it’s no great effort to rephrase it.

For example, a sentence such as ‘His garden looks very different from mine’ might have two possible meanings: his garden’s appearance is unlike that of mine; or the appearance of his garden changes if you look at it from the perspective that mine offers.

Secondly, the question of ‘allowing’ this usage is laughable. Language pundits have embarrassingly little influence on general usage. We can recommend, but there really is no question of allowing or forbidding. Nobody is in charge of the language. Nobody ever has been.

The idea that ‘different to’ is wrong was first expressed by Robert Baker in 1770, but he was already too late. It is well- and long-established in British English:

  • William Bonde (1526): His lyght is moche different and vnlyke to the lyght of the holygoost.
  • Henry Fielding (1737): ‘Tis a quite different Thing within to what it is without.
  • Thomas Paine (1811): The king, who, very different to the general class called by that name, is a man of good heart…
  • WM Thackeray (1852): The party of prisoners lived…with comforts very different to those which were awarded to the poor wretches there.

True, it has always been less common than ‘different from’: Google Books puts ‘different from’ well ahead of ‘different to’, by about 14 to 1 at the end of the 20th century – but falling sharply. And uncommon doesn’t mean wrong.

More recently, in the UK part of the GloWbE corpus (recording language used on web pages in 2012), ‘different from’ is only modestly ahead of ‘different to’, by about 10,000 to 7,000. What’s more, of the first hundred entries for ‘different to’, not a single one uses it in the second way that Gwynne suggests. His fear of ambiguity seems overblown.

Still, some people don’t like it, and that’s worth remembering. Attitudes to usage – like usage itself – need not have a rational basis, but they’re facts all the same.

Despite its critics, though, ‘different to’ is irreversibly a part of the language. That ship has sailed, crossed the ocean, reached its destination, been dismantled, and had its parts used to build houses in which generations of people have lived happy and productive lives.

Update: catteau in the comments reminds me of a good post on this by Stan Carey.

For writers

An editorial response to Hamilton Nolan’s ‘Against editors’.

Nolan has some interesting points to make and some good language to make them with. But the article has a few weaknesses that limit and distort its impact.

Here’s a quick paragraph-by-paragraph summary:

  1. In publishing, writers are unjustly subordinate to editors.
  2. But don’t writers need editors?
  3. Well, OK.
  4. But writers who want career advancement have to become editors.
  5. Writing and editing are different skills, though, so the result is a loss of good writers and an accumulation of bad editors.
  6. Passing a story through a series of editors is wasteful and doesn’t help.
  7. Editing is subjective, and such a series could in theory go on indefinitely.
  8. These unnecessary editors feel the need to edit to justify their existence.
  9. Sure, some editing is necessary, but it’s not as important or as hard as writing – and it’s better-paid.
  10. Let’s keep the good editors but get rid of the needless, bad ones and instead allow writers to get raises and promotions as writers.
  11. Stories are what matter, and stories come from writers, so writers should matter more.

The first problem is near the start. Para 2 includes this imagined response to his opening shot against editors:

You’re one to talk. Your grammar is wronged, your metaphors are blunt bricks, and your similes are like a hot needle to the eyeball. Your infinitives are split, your participles are dangling, your spelling is eroneous, your cliches are old as time, your sentences are repetitive, and your sentences are repetitive.

You see what he’s doing: “old as time”, for example, is itself a cliché, and so on. But in fact, “old as time” is a short phrase that readers will harmlessly process in barely a second. The real cliché here is the whole paragraph.

An article about editing that makes a string of deliberate blunders in a list of those very blunders? Really? Come on. That technique stopped being cute around the same time Happy Days did. Readers of this article are likely to have read other articles about editing, and so to have seen this trick before.

The other downside of this is that it slows the piece down, just when it needs to pull readers in. After stating his case in only the vaguest of outlines, Nolan is already anticipating objections and hedging with caveats. Para 2 (and by extension para 3) kills his momentum. The caveats could wait until paras 9 and 10.

A second problem is that “editor” covers several very different jobs, and Nolan doesn’t distinguish. Titles and processes vary between publishers, but a crude typology might be:

  • managing editors, who hire and fire, set the overall policy of the publication, and do all sorts of businessy things that have nothing to do with the articles themselves
  • commissioning editors, who decide what articles should be written and who should write them
  • developmental editors, who deal with tone, focus and structure; they may make changes themselves or feed back to the writer
  • copyeditors, who look at clarity, accuracy, consistency, house style, grammar, usage, typos and suchlike.

These roles often overlap: managing editors may commission and may look over articles before publication; commissioning editors may do the developmental work; copyeditors may have licence to veer into developmental territory. But the editors who do the most wrestling with the words are usually not the ones with the money, power and status.

The higher-level editors may well be writers who have climbed the pole. But copyeditors generally didn’t get where they are by working their way up – up! – from being successful writers. So the career hierarchy Nolan sketches and the series of editors he confronts don’t match up.

By not being clearer about what he means, he risks misleading, confusing or annoying readers (depending on how much they already know).

A third weakness comes in para 10:

It is absurd that most writers must choose between a career spent writing and a career that offers raises and promotions.

This point – that writers should be able to make career progress as writers – is crucial. It’s the one thing he really wants. But he gives no detail. He must have been thinking about this for a long time; he must have ideas. Surely he can’t just mean getting paid a bit more as you get a bit better, year after year. What would these senior writing jobs be? I’m intrigued. But after this, all we get in para 11 is a peroration whose rhetorical force depends on blurring the chronology and importance senses of “first”:

Stories are, ultimately, what matter. Stories are what websites and magazines and media “brands” live and die on. Stories come from writers. Writers come first. They shouldn’t be second-class citizens in their own industry.

Nolan could have trimmed that and got rid of the earlier clichéd caveat, freeing up space to elaborate and clarify where it would help. The article would have had more focus, force and depth as a result.

He has things to say (never mind whether I agree) and a talent for saying them, but the extra perspective of an editor could have helped him raise his voice and speak more powerfully to his readers.

Because that’s what editors are really for: we’re for the readers. And we’re for the writers.

Doctor, doctor…

Simon Rich’s ‘guy walks into a bar’ joke is really well done. Because I have little originality and even less shame, I’ve stolen adapted his approach to make my own.

Write for a single reader

People sometimes say you should write the way you talk. I see what they’re getting at – be more direct and flowing, less stuffy and formal – but you shouldn’t take that advice too literally.

Writing and speech work very differently. If you’ve ever transcribed a conversation, you’ll know that a lot of the time people don’t even talk in sentences. And anyway, some of us are more fluent in writing than in speech.

Maybe a better version of that tip would be to write in a way that would sound natural if you read it out loud, or to write the way you would talk if you were an Aaron Sorkin character.

But a different way of making the same point struck me recently.

I was doing a piece of editing, and one passage in the text didn’t make sense. It was ambiguously phrased, and I didn’t know enough about the subject to figure out the intended meaning. So I emailed the writer and asked him to help.

His reply consisted of two paragraphs.

The first began “I was trying to make the point that…” and then gave me a perfectly lucid, direct explanation of it. The second began “So I would suggest rewriting it as follows…” and then presented a rewrite that, while clearer than the original, was a good bit stiffer and more overwrought than the explanation he’d just given me.

What had happened was very simple. First he had answered my question, telling me – one person to another – what he wanted me to understand. And then he had gone into Writing Mode.

In Writing Mode, you put aside your ordinary, natural fluency for fear that it isn’t up to the occasion, and you reach for ornate words and sentence structures to self-consciously craft a declaration. But these efforts often just get in the way of communicating.

So my advice is to write for an audience of one. Don’t let yourself be daunted by the sense that you’re addressing a crowd.

Sure, you may be aiming for a large readership, but each of them will read as an individual. So write for an individual. Imagine one of them in particular – someone who may well not have the same knowledge and priorities as you – and write directly for that one person.

Try it. Yes, I mean you.
