Monthly Archives: November 2014

Cock stars and unparliamentary language

The Mail on Sunday reports that Penny Mordaunt MP used the word “cock” in Parliament as the result of a bet.

The alleged newspaper describes how she crafted a speech about poultry, supporting “the cause of hen and cock welfare”.

Or, as they put it:

Communities Minister Penny Mordaunt said ‘c**k’ six times

I’ve often doubted the point of using asterisks for swearwords. The word still makes its journey from the writer’s head to the reader’s. The communication happens as normal because the intent is clear.

I suppose the usual reason is to protect any young children who might see it – but I wonder whether that really works.

In this case, writing “c**k” instead of “cock” seems doubly silly, and not just because it’s a fairly mild swearword. (You’ll note that they didn’t put “c***” – that is, they printed enough to make the word perfectly clear.)

The whole point of Mordaunt’s speech, and of any such innuendo, is that the word used is nominally innocent: a cock is a male bird. Nobody would censor that. But the Mail has decided that, because of Mordaunt’s secret intent while saying it, it cannot be printed.

So here, the Mail has judged that intent is reason to censor a word, but to do so in a way that leaves that intent clear. Daft.

In passing, I note that the article begins: “A female Tory Minister…” The Mail still thinks it necessary to point out that a woman doing a Proper Job is a woman, as if her name and the huge photo of her aren’t enough to make that clear. That, rather than a minor piece of parliamentary childishness or any amount of cockery, is what dismays me.

Physician, explain thyself: science English vs lay English

Doctors and scientists may be brilliant in their fields, but that doesn’t mean they’re great at communicating their knowledge to the rest of us. This is obviously important when doctors are explaining things to their patients, as it is when research scientists are seeking to engage the public with their work. To succeed, the experts need to tailor their language to a lay audience.

There’s a lot of good advice you can get on how to tell engaging stories about science, and on how to apply principles of clarity and coherence to scientific prose. But, like Vincent Vega in Pulp Fiction, I’m most struck by the little differences between lay English and the English of biomedical science.

I’m going to talk you through a few of the things that I notice again and again as a copyeditor. Biomedical science is my patch, but some of these points will apply to any expert who wants to address an audience of outsiders.

Jargon

Specialist terms are great among specialists, but they can make an outsider’s brain shut down. Everyone knows this, but specialists often underestimate how niche their own jargon is. If you and all your colleagues use a word every day for years, it becomes familiar and obvious. But that doesn’t mean knowledge of it has spread more widely.

For example:

  • A study in the American Journal of Emergency Medicine found that 79% of adult patients in a US hospital emergency department didn’t know that “hemorrhage” means bleeding; 38% didn’t know that “sutures” are stitches; and 78% didn’t know that a “fractured” bone is a broken bone (sorry to say, nor did I!).
  • In the UK, the Wellcome Trust Monitor survey of the public found that 48% of adults felt they had a good understanding of the term “DNA” and another 37% some understanding. But only 12% felt they had a good understanding of “human genome” and 14% some understanding; 19% had heard of it but didn’t understand it and 52% had never heard of it.

Steven Pinker writes insightfully about this “curse of knowledge” and how to overcome it. The best trick is to show your writing to an outsider and see what puzzles them. Failing that, you should try to distance yourself from your writing when you review it, and err on the side of caution.

Unless part of your aim is to educate people about specialist terms, avoid them where you can by substituting general descriptions (“parts of a cell with specialised functions” instead of “organelles”) or common names, if they exist (“white blood cell” instead of “leukocyte”). Where you need to use them, introduce them with short, straightforward definitions – and then use them consistently.

The more terms you introduce, the more the reader will struggle – even if you explain each one clearly, newly learned words are harder to keep in mind than old, familiar ones.

I do not think it means what you think it means

Jargon can bewilder people, but they do at least know it when they see it. Potentially more confusing is when everyday words, used in well-known ways by most of us, are used in subtly or completely different ways by scientists (whether across the board or just in certain fields).

For example, in the context of communicating climate science, a paper in Physics Today lists several such words, including:

  • Scientists use “theory” to mean an explanatory framework, but the public use it to mean a hunch or speculation.
  • Scientists use “uncertainty” to mean the range of an estimate, but the public use it to mean ignorance.
  • Scientists use “positive trend” to mean an increase and “positive feedback” to mean a vicious circle, but the public use “positive” to mean good.

In the same vein, a report by the Royal College of General Practitioners cautions that while doctors use “chronic” to mean persistent, patients may think it means severe.

And a few that I’ve come across recently:

  • Scientists may use “determine” to mean influence, but it’s commonly understood to imply more complete control.
  • Scientists use “significant” to mean they’re confident an observation isn’t just chance, but it’s commonly understood to mean important or telling.
  • Scientists may use “fraction” to mean a part, but it commonly has a connotation of being a small part.
  • Scientists and doctors may use “insult” to mean biological damage or injury, but, as the common phrase suggests, insult and injury are widely understood to be different.
  • Doctors often use “trauma” to mean physical damage, but it’s commonly used psychologically.
  • Scientists use “factor” in several specific ways, but to most people it’s vaguer, normally meaning an aspect of a situation that contributes to something happening.
  • Biologists often use “fate” to describe the result of a cell’s development. I like the poetry of this, but for most of us, the word has strong connotations of inevitability and death.
  • Biologists use “analogous” to mean having the same function but a different evolutionary origin; ordinarily, it just means similar to.
  • Scientists may use “predict” to mean give information about or indicate, without the predicted thing necessarily being in the future – the word’s normal meaning. This might make sense from the perspective of a scientist who has yet to examine the predicted thing, but that can confuse lay readers.

Speaking of perspective…

Points of view

Scientists writing about their research in journals rightly take a scientist’s perspective. And science writers who want to tell a story about a scientist’s work can sensibly do the same, although in a different way.

But when the focus is on the subject matter of the science rather than the process of doing it, and when the audience is general, some verbal habits can carry over from journals in little ways – turns of phrase that make prose a bit harder for a lay reader to get into.

For instance, people with a medical condition might be called “patients” or even “cases”.

Being a patient is a limited role rather than an identity. For doctors, who see us in their professional capacity, we definitely are their patients. And we accept that. But after we go home or back to work, that role recedes. To be a patient is typically to be receiving treatment; just having a condition doesn’t quite cut it.

And “cases” is reasonable from an epidemiological point of view, but maybe not if those cases are your readers.

Then there are a few words that shift the perspective from the subject matter to the observations of the people working on it. Writing about “a child showing symptoms of flu” puts the reader in the shoes of the doctor to whom these symptoms were shown. Likewise “presenting” and “displaying”. If you’re writing about what the doctor saw and did, that’s fine. If not, try just “a child with symptoms of flu”.

In with the in crowd

Prepositions are slippery little suckers. These are the words that show relationships between things: on, in, near, under, about, before, during, of, from, to, by…

There are only a couple of hundred prepositions in English, which means they each tend to accumulate a lot of uses. The OED lists 63 definitions of “on”, not counting the archaic or obsolete ones. Which uses go with which prepositions is often arbitrary, and people speaking a second language can easily slip up (French “sur” doesn’t neatly match those 63 English “on”s).

I’ve noticed that in biomedical English, some of the preposition use is different from that in lay English. The biggest difference is with “in”.

I come across descriptions of treatments working in (not for) patients, studies in (not of or on) mosquitoes, drugs used in (not on or for or against) hypertension, bacteria that are deadly in (not to) mice…

Why is this? I think it may be perspective again. If you’re a scientist examining something (a patient, a mosquito, hypertension, a mouse), it may be natural to think of that thing as a container – figuratively or literally – of observable phenomena. So things happen in it. Maybe.

But to laypeople, this sort of language is unusual. It’s a small but distinct sign that a piece of writing isn’t meant for us. And if it is meant for us, that sign is a mistake.

In a similar vein, there are some differences in how articles are used.

Doctors and medical scientists often write things like “prospects for recovery after stroke” while most lay people would say “prospects for recovery after a stroke”. Likewise, compare “damage to visual cortex” and “damage to the visual cortex” (assuming that “visual cortex” isn’t too jargony).

So…

I hope some of this might be helpful. If you have any thoughts on things I could add or change, please let me know in the comments.

Update: I want to stress that while I’ve focused on science and medicine, the problems of insiders talking to outsiders can pop up anywhere. The social sciences, humanities and arts have their own ways of using language. So do people in pretty much any line of work: rocket scientists and brain surgeons, managers and politicians, plumbers and hairdressers.

Has a mechanic ever told you what was wrong with your car while you nodded along dumbly and reached for your wallet?

For any organisation you might work in, you and your colleagues will know more about it, and probably talk about it differently, than people who don’t work there (or new starters).

Even socially, if you tell someone an anecdote about a person or situation they don’t know, you may need to fill in some of the background for them to understand it properly.

And I suppose even language bloggers might use too much editing or linguistics jargon from time to time. Sorry about that…

Bad reactions to usage: it cuts both ways

John Rentoul and Oliver Kamm are having a disagreement of some sort about language.

Oliver says that “for Chris and I” is grammatically fine and need not be changed to “for Chris and me” for correctness’ sake:

Pedants take it as a given that the grammatical rule for assigning case is the same for a coordinated (or conjoined) pronoun as it is for a non-coordinated one. Yet that doesn’t follow.

He takes a similar position on many of the prescriptive “rules” of English.

John agrees that there’s nothing strictly ungrammatical about this sort of thing. But he thinks that it’s advisable to avoid it anyway:

My defence of pedantry is twofold. The negative argument is that the writer or speaker should do as little as possible to distract the reader or listener from what is being said. And the positive argument is that it is worth observing the conventions of “correct” spelling and grammar, even if we know that they are arbitrary, because they are markers of quality.

Oliver replies:

I am eager, even zealous, that conventions of Standard English be taught well.

But Standard English is a dialect, not “correct English”. Good writing depends on register, not pedants’ ‘rules’.

They do seem to be disagreeing, but it’s hard to pinpoint exactly what about. Maybe just emphasis?

Both use standard English in their own writing, and both recommend knowledge of standard English to others. Both put “correct” in scare-quotes in this context.

I think John worries that if Oliver goes around insisting that non-standard conventions aren’t wrong, people will use them in standard writing. And I think Oliver worries that if John goes around warning against non-standard usages, people will think non-standard is wrong.

These could both be reasonable worries; a lot of usage debate is painfully simplistic, of the “X is right and Y is wrong” variety. Nuanced positions, in language as in politics, tend to get ignored or caricatured.

I agree with Oliver that people shouldn’t shrink their own linguistic repertoires (and those of others) out of a fear of a mythical rulebook. And I agree with John that people need to have good standard English when it’s the right tool for the job – as it often is. Which of these two points is more important will depend on the situation.

Anyway, they hardly need me to adjudicate.

My response to John’s twofold “defence of pedantry” is that his positive and negative arguments both boil down to this: write in the way that will make your message come across best to your readers.

I agree. But it cuts both ways.

Sure, some people are annoyed by usages that they regard as wrong. And whether they’re right that these are wrong is beside the point: if they react badly, you lose them as readers. However, a rigid insistence on certain rules (or conventions) can make sentences sound bad to other people. A studiously unsplit infinitive can come across as awkward; zealously “whom”ing can seem stuffy. This is a different kind of bad reaction but it can still cost you readers.

It depends on the audience.

You can give yourself a headache trying to work out exactly what’s a matter of correct vs incorrect, what’s formal vs informal, what’s standard vs non-standard, what’s a matter of stylistic taste, and what’s one of the “rules” that finds support from some small-to-medium-sized proportion of standard English speakers.

But if you’re a writer or editor, you don’t really need to: you just need to judge whether the punters will swallow it.

(I’ve written at more length on this before.)

Our survey says…

A YouGov poll showed 1000 Americans some sentences and asked them to pick which one of each pair or group was grammatically correct.

These were the results:

  1. My oak tree loses it’s leaves in autumn. – 31%
    My oak tree loses its leaves in autumn. – 67%
  2. The dogs are happily chewing on they’re bones. – 4%
    The dogs are happily chewing on their bones. – 89%
    The dogs are happily chewing on there bones. – 4%
  3. I think you’re very smart. – 91%
    I think your very smart. – 6%
  4. I consulted an attorney whom I met in New York. – 61%
    I consulted an attorney who I met in New York. – 32%
  5. I don’t trust fruits and vegetables which aren’t organic. – 24%
    I don’t trust fruits and vegetables that aren’t organic. – 68%
  6. The beneficial affects of exercise are well documented. – 17%
    The beneficial effects of exercise are well documented. – 77%

The results are noteworthy, but the poll has two weaknesses.

One would have been painfully easy to fix: adding a “both” option. While 1, 2, 3 and 6 are unarguably testing common mistakes (albeit of punctuation and spelling rather than grammar), 4 and 5 are contested usage issues. Forced to choose one or the other, some people will have leaned towards a vaguely remembered rule that they may not really care about. It would have been good to know how many people think either option is fine.

The second problem is deeper: if you set up a test like this, telling people that there’s only one right answer, their reactions will become unnatural. They will turn their prescriptivism up to maximum and worry about things they would normally happily ignore when reading or in their own usage.

A few remarks on these results:

Far more people got 1 wrong than the similar 2 and 3. This suggests that the lack of an apostrophe in possessive “its” is confusing – aren’t we supposed to use apostrophes to show possession? Pesky illogical English.

From 4, we can see that “whom” isn’t overwhelmingly regarded as necessary for an object. As for the third of people preferring “who” here – I wonder how many were answering as asked, about correctness, and how many were just picking the level of formality they preferred.

The answers to 5 show that the campaign to ban restrictive “which” has made an impression on people (at least in the US). But again I stress that we don’t know whether these answers reflect actual usage and firmly held opinions or just “oh yeah, I think I’ve heard that you should use ‘that’”.

Ask a stupid question

Geoffrey Pullum sifts through the contents of alleged “grammar” quizzes. He is scathing:

if quizzes on chemistry were as uninformed as those on grammar, they would ask silly questions on peripheral topics (“Who is the Bunsen burner named after?”), and would make no reference to the periodic table, or atoms or molecules. The web’s grammar quizzes deal in minor pieces of puristic flotsam with roots in dimly understood 18th-century grammatical analysis.

Why are these quizzes so bad? He rightly blames the incompetence of the setters and their obsession with “fiddly details, gotcha points, and timeworn myths”. But there’s a problem on the other side too: while a proper linguist like Pullum could write a proper test on the features of English grammar, most of the people who took it would struggle. He notes that “few educated people know anything about grammatical analysis”.

We are, most of us, lacking in the theory of grammar. We might vaguely remember a few schoolroom basics, like “a noun is the name of a person, place or thing”. But this is about as useful a definition as “a bird is an animal that flies”. Sure, it identifies the prototypical cases, and it can get a child into the right sort of territory to begin to understand, but the truth is much more complex.

Look at nouns like “unclarity”, “chance” and “sake” in this example of Pullum’s:

There’s a great deal of unclarity about what the heck would happen in a crisis during your absence, and there’s every chance of that, for heaven’s sake!

No meaningful definition of “thing” can accommodate these. Instead, you have to define grammatical categories by their grammatical properties. But this knowledge is uncommon.

On the other hand, if you want to test people’s practical grasp of grammar – well, that’s easy. But pretty pointless.

Recoiling from complexity

Nathan Heller has critically reviewed Steven Pinker’s The Sense of Style. His main complaint is that Pinker’s advice includes too much leeway based on context and subjective judgement and too little respect for the “basic principles of English grammar”. For example:

English is complex. To help reduce ambiguity, modern usage attaches specific words to specific functions. The restrictive-nonrestrictive division between “that” and “which”—two particularly common and shifty words—is one attempt at clarity.

Heller cites “usage”. To remind you: usage is what people do with language. So, do people use ‘that’ to open restrictive clauses and ‘which’ to open nonrestrictive ones? Some do, but many don’t.

A great little study by Jonathon Owen, for his Master’s thesis at Brigham Young University, looked at the kinds of changes copy editors make. He reviewed edits made to manuscripts written by academics and found one of the commonest to be imposing “the that/which rule, proscribing the use of which as a restrictive relative pronoun”. This tells us that this distinction may be widely observed by copy editors but less so by even educated writers of formal prose – and presumably less still by the public as a whole.

The American Heritage Dictionary (AHD) goes further, noting that “this use of which with restrictive clauses is very common, even in edited prose”.

So yes, people have attempted to create this distinction: EB White recommended it, so did HW Fowler, so does Garner’s Modern American Usage. And dedicated copy editors have done their best to make usage follow this distinction (including me, until I wised up). But usage persists in going its own way (Garner concedes that restrictive ‘which’ is ubiquitous). And no lack of clarity has resulted, because we use commas to highlight nonrestrictive clauses.

Heller has further concerns:

Another is the rule that “like” joins noun phrases, while “as” or “as if” is for verb phrases. (“It looks as if my date is here!” “You look like Mom in that dress.”) Pinker doesn’t see the point of that one, either.

His grammar terms aren’t quite right here: “my date is here” is a clause, not a verb phrase; “is here” is the verb phrase.

Again, this is a rule insofar as people like Heller insist on it, but by the standards of usage there’s far more leeway, depending on the situation. The AHD again:

They don’t make them like they used to. I remember it like it was yesterday. As these familiar examples show, like is often used as a conjunction meaning “as” or “as if,” particularly in speech. While writers since Chaucer’s time have used like as a conjunction, the usage today has a somewhat informal or conversational flavor. Language critics and writing handbooks have condemned the conjunctive use of like for more than a century, and in accordance with this tradition, like is usually edited out of more formal prose.

But it adds:

Like is acceptable at all levels as a conjunction when used with verbs such as feel, look, seem, sound, and taste: It looks like we are in for a rough winter.

And Garner accepts that conjunctive ‘like’ is “acceptable casual English”, although not yet “unimpeachable”. Again, genuine confusions caused by using ‘like’ in this way are vanishingly rare. Heller raises another issue:

It’s for grammatical consistency, not beauty or gentilesse, for example, that correct English has us say “It was he” instead of “It was him.” Pinker calls this offense “a schoolteacher rule” that is “a product of the usual three confusions: English with Latin, informal style with incorrect grammar, and syntax with semantics.” … [But] the nominative and accusative cases are the reason that we don’t say gibberish like “Her gave it to he and then sat by we here!” No idea is more basic to English syntax and grammar. In the phrase “It was he,” “it” and “he” are the same thing: they’re both the subject, and thus nominative.

I agree: it definitely isn’t for beauty that anyone says “It was he”. It sounds so prissy I can hardly bear to type it.

And Heller is right about the difference between nominative and accusative, but then he misapplies it because of another grammatical misunderstanding. In “It was he”, “It” and “he” are not both the subject: “he” is the complement of the verb, not a second subject.

You can see this more clearly in the grammatically equivalent “It was they”. Subjects govern verbs, and here, the singular subject “it” gives us the singular form of the verb “was”. Then “they”, being plural, is obviously not the subject. Heller’s mistake is precisely the confusion of syntax with semantics that he quotes Pinker warning against. The two pronouns have the same referent but not the same grammatical role.

All this nit-picking is cracking good fun, but it’s beside the point. According to usage, “It was he” lives happily enough at the formal end of the spectrum but struggles elsewhere. More from the AHD:

Traditional grammar requires the nominative form of the pronoun in the predicate of the verb be: It is I (not me); That must be they (not them), and so forth. But in less formal contexts the nominative pronoun can sound pretentious and even ridiculous, especially when the verb is contracted, as in It’s we. The traditional rule creates additional problems when the pronoun following be also functions as the object of a verb or preposition in a relative clause, as in It is not them/they that we have in mind, where the plural pronoun serves as both the predicate of is and the object of have.

Garner also agrees that “it is me and it’s me are fully acceptable, especially in informal contexts”. The resultant alleged lack of “grammatical consistency” causes no harm. One final complaint from Heller:

The same is true of “who” and “whom,” another nominative-accusative pair to which Pinker objects, sort of. He writes, “The best advice to writers is to calibrate their use of ‘whom’ to the complexity of the construction and the degree of formality they desire.” Yet who wants to undertake that calibration all the time? The glorious thing about the “who” and “whom” distinction is that it’s simple.

Whom the hell is this guy trying to kid? It’s easy to calibrate formality to context. We all do it all the time.

(And, as Ben Zimmer reports, even writers as good as Nathan Heller trip themselves up on the ‘who’/‘whom’ distinction.)

Garner also likes the nominative/accusative distinction here, but finds that “in certain contexts, whom is stilted”. The AHD agrees: “in practice whom is uncommon in speech and everyday writing because it has a formal tone”.

So, overall, what lies behind Heller’s objections?

This tendency to add complexity, ambiguity, and doubt is a troubling feature of Pinker’s rules.

This is telling. Heller wants language (and guidance on it) to be simple. That’s admirable, but his insistence on unbending rules can lead to a conception of language that is not just simple but simplistic. While he knows in general that language is complex, he seems to recoil from glimpses of this complexity.

Language is alive and its conventions are diverse: as such a vast category of human behaviour, how could it be otherwise? But to accept that is to abandon the certainties of the rulebook, and some people find that troubling. I know I did, at first. Then I found it exhilarating.

While I’m here: I mostly liked Pinker’s book. In the tradition of Joseph Williams’s Style, Pinker doesn’t just set out a list of dos and don’ts – he takes pains to explain what promotes clarity in writing. He covers how to structure a sentence so it’s easier to understand, how to show logical links between sentences and create a sense of flow through a paragraph, how to fight the “curse of knowledge” and focus on the reader’s perspective rather than your own, and a lot more.

These things matter so much. In my copy editing, I daily come across writing that needs these sorts of principles applied. They are far more important than the piddling little worries – like split infinitives and ‘literally’ and ‘very unique’ and double negatives and Heller’s points above – that dominate public debate about language.

On the negative side, I could have done without all Pinker’s sentence diagrams, which I wasted a lot of time trying to understand before realising the details were barely relevant. And the section near the end, where he lists a load of words that he thinks people should stop misusing, goes against the spirit of the rest of the book by not giving reasons. I would also have liked more detail on some of the academic studies he cites to back up his arguments – I don’t have the time or resources to chase up references.

But on the whole, it’s a damn good book.