The Institute of Physics has launched a campaign against using the word “boffin” to refer to scientists and other researchers. They argue that the word conjures up an unhelpful, outdated stereotype that could put young people off science. According to their survey: “When asked to describe what a boffin looks like in three words, respondents painted a clear picture: glasses, geeky, nerdy, male, white coat, serious, bald and posh.”
“Boffin” is one of those odd, slightly dated slang words that don’t much exist outside of UK newspapers. (See Rob Hutton’s book Romps, Tots and Boffins: The Strange Language of News for an insider’s guide to the lexicon of the Great British press.) So the Institute of Physics is in particular asking the big tabloid newspapers to stop using it.
This plea is obviously a red rag to a bull, and the Daily Star has duly (and quite magnificently) charged:
Daily Star headline reading “Boffins: don’t call us boffins”
According to the Star, and I think they mean it, they use “boffin” not just cheekily but affectionately. “We bow to no one in respect of our boffins. But the berks have buggered it up with this Bin the Boffin befuddlement.”
But one argument that might carry a bit more weight than avoiding stereotypes is that the word is – perhaps surprisingly – unclear. The Institute of Physics found:
Over a third of all adults and young people surveyed had never heard of the term before. For those who had heard the word before, there was confusion as to what boffin meant. Suggestions that were put forward of what boffin means included a kind of bird, a type of biscuit, or even a fancy coffin.
So maybe this is a word (for which I confess I have a tongue-half-in-cheek fondness) whose limited niche is going to contract as the generations change.
Its first known use, according to the boffins – sorry, lexicographers – at the OED, was during World War II. It started off meaning an older officer, but it soon shifted to mean “a person engaged in ‘back-room’ scientific or technical research” – perhaps because older officers were more associated with such roles. It was applied in the RAF to scientists working on radar:
Their ages are as youthful as air crews. Thirty-two is considered the maximum… In H.M.S. Wasps’ Nest, anyone aged thirty-two is officially a ‘boffin’. There is even a song about them… ‘He glares at us hard and he scowls, For we’re the Flotilla Boffins.’ (C Graves, 1941)
A band of scientific men who performed their wartime wonders at Malvern and apparently called themselves ‘the boffins’. (Times, 1945)
‘What’s a boffin?’ ‘The man from Farnborough. Everybody calls them boffins. Didn’t you know?’.. ‘Why are they called that?’.. ‘Because they behave like boffins, I suppose.’ (N Shute, 1948)
The origin is, to paraphrase the OED, anyone’s guess. Etymonline suggests that it may have been a reference to a fictional character, perhaps Mr Boffin in Dickens’s Our Mutual Friend. I haven’t read it, but I gather he’s a genial but bumbling figure, an illiterate former dustman who unexpectedly inherits money and hires a personal reader to broaden his education. Not quite sure how that would translate into the WWII usage, though. It feels like a private joke among a small group of friends which then caught on, changing as it did.
So what’s the alternative? The physicists suggest “scientist”, or being specific about the relevant specialism. The only problem there is that “scientist” is harder to fit in a headline than “boffin”. The only relevant word I can think of that’s comparably short is “expert”. But that, in UK politics, comes with its own cultural baggage.
Further research is needed! Send for the bo— [gunshot]
David Bentley Hart has written a witty, insightful, elegant and provocative piece on ‘How to write English prose’. Given the topic, it’s hard to judge the style and the substance separately.
On the whole, I enjoyed his writing. Savour this passage, on why cultures develop great prose far later than great poetry:
Poetry entered the world almost as early as words did; it is the first flowering of language’s intrinsic magic—its powers of invocation and apostrophe, of making the absent present and the present mysterious, of opening one mind to another. It comes most naturally to languages in their first dawn, when something elemental—something somehow pre-linguistic and not quite conscious—is still audible in them. Prose, however, evolves only when that force has been subdued by centuries upon centuries of refinement, after unconscious enchantment has been largely mastered by conscious artistry, and when the language has acquired a vocabulary of sufficient richness and a syntax of sufficient subtlety, and has fully discovered its native cadences.
That’s a gem.
I also relished his scorn for the unjustly famous writing advice of Strunk and White: “In fact, if you own a copy of The Elements of Style, just destroy the damned thing. It is a pestilential presence in your library. Most of the rules of style it contains are vacuous, arbitrary, or impossible to obey, and you are better off without them in your life.” He does Orwell too.
But Hart is a bit too fond of obscure words: within the space of a 200-word passage near the start, he introduced me to anfractuous, volutes, modillions and quadrature. I don’t mind being sent to the dictionary now and then – it’s good to learn new words – but each occasion acts as a dam that interrupts the flow of the piece. Too many, and the whole thing can dry up.
A proud indifference to the reader’s vocabulary, though, is part of his argument: great prose blends the simple and the complex, whereas nowadays too many writers hew to a bland, formulaic conception of plain English – “denuded of nuance, elegance, intricacy, and originality”.
I should point out that Hart means his recommendations to apply to literary prose – fiction, essays and the like – rather than more functional writing such as public health information leaflets. At least, for the sake of public health, I hope he does.
*
Time for some backstory.
I first encountered Hart’s writing a little over a decade ago, when he wrote a pair of pieces complaining about the myriad failings of common usage. He passed judgement on the correct meanings of words including infer, hopefully, fortuitous, intrigue, momentarily, presently, refute, restive, transpire, reticent, aggravate, enormity and fundament.
I’d say that on a majority of these words, his advice is sound – or at least that he raises a fair concern. But my way of thinking about these questions differs from his. I come at them looking at what aligns with current usage, while he seeks authority in the traditions of literature: “a word’s proper meaning must often be distinguished from its common use”.
On transpire, he’s particularly stern: “I am as inflexible as adamant, as constant as the coursing stars: it does not mean ‘occur,’ no matter how many persons use it that way.” Even allowing for theatrical exaggeration, the rejection of actual usage is unjustified dogmatism.
For me, the main factor is how readers will understand a word; for him, it’s how they ought to understand it. And this connects perfectly with his view on rarefied words.
In his new article, he offers a set of rules for writers. The very first one is:
1. Always use the word that most exactly means what you wish to say, in utter indifference to how common or familiar that word happens to be. A writer should never fret over what his or her readers may or may not know, and should worry only about underestimating them.
In a similar vein, his third rule is:
3. When the occasion presents itself for using an outlandishly obscure but absolutely precise and appropriate word, use it.
His merry indifference to the obscurity of a word, when it carries the precise meaning he wants, raises the same question as his prescriptivism: to whom does this word carry that precise meaning? In the esoteric cases of farraginous, purling, banausic and other selections he makes, the answer must be: not many people. Even, I suspect, among the readership of literary magazines.
You can view the Great Writers of yesteryear as the best guides to a word’s “proper meaning” if you want, but for the purposes of communication a word’s actual meaning is what – if anything – it means to the person reading it.
In this, I believe I’m in agreement with at least the spirit of Hart’s second rule:
2. Always use the word you judge most suitable for the effect you want to produce, in terms both of imagery and sound, as well as of the range of connotations and associations you want to evoke.
An effect produced is produced only in a reader’s mind; connotations and associations are always for someone; meaning is not independent of the community of language users and their understanding of words, an understanding that may vary from person to person, generation to generation.
*
Whether on obscure words or disputed words, Hart almost – almost – seems to believe that meanings are perfect Platonic forms, abstracted from the grubby, flawed, mundane business of human communication and existing in some transcendent realm to which mere mortals have only limited access.
I knew that Hart is a theologian, so I did a bit of googling (look, if you want proper research then you’re going to have to start paying me), and it turns out that, yes, he is a Platonist.
Here he is, some years back, on the idea of “truths deeply hidden in language”:
Consider, for instance, the wonderful ambiguity one finds in the word invention when one considers its derivation. The Latin invenire means principally “to find,” “to encounter,” or (literally) “to come upon.” Only secondarily does it mean “to create” or “to originate.” Even in English, where the secondary sense has now entirely displaced the primary, the word retained this dual connotation right through the seventeenth century. This pleases me for two reasons. The first is that, as an instinctive Platonist, I naturally believe that every genuine act of human creativity is simultaneously an innovation and a discovery, a marriage of poetic craft and contemplative vision that captures traces of eternity’s radiance in fugitive splendors here below by translating our tacit knowledge of the eternal forms into finite objects of reflection…
A god… whose works are then unnecessary but perfectly expressive signs of this delight, fashioned for his pleasure and for the gracious sharing of his joy with creatures for whom he had no need, is a God of beauty in the fullest imaginable sense. In such a God, beauty and the infinite coincide; the very life of God is one of, so to speak, infinite form; and when he creates, the difference between worldly beauty and the divine beauty it reflects subsists… in the analogy between the determinate particularities of the world and that always greater, supereminent determinacy in which they participate.
For me to pontificate on Christian theology would be ultracrepidarian, but the analogy between Hart’s view of divine creation and his prescriptivism about meaning does seem striking.
I doubt he believes that every English word (and every word in French, Japanese, Yoruba, Cherokee, Farsi, etc, etc) really has a Platonic true meaning independent of usage. I think it’s more likely that when he writes, he strives to emulate in some small way the spirit of creative, self-expressive joy that he describes above. He then shares his joy with readers – although we’re readers for whom he has no need. If we like what he writes, that’s great, but we’re not really the point. Hence his disregard for semantic understanding other than his own.
That’s a defensible (if self-indulgent) position for a literary stylist, but as a general philosophy of usage and meaning its authoritarianism literally defies comprehension.
*
Clearly Hart has a magnificent vocabulary, and revelling in language is a fine thing to do. And yet… of the seven words in his article that I didn’t know (not counting passages quoted from elsewhere), four appeared in the first 5% of the text, and two more in the following 15%. The remaining 80% contained only one. Maybe that’s just chance, or maybe he got bored of searching for those exact meanings as the task of writing wore on – or maybe he deliberately front-loaded the piece with a display of exotica to dazzle the reader before settling down to the business of arguing more intelligibly for linguistic complexity.
Who can say?
But consider this: in his article he offers as examples of great prose ten passages from other writers, totalling nearly 1,200 words – only one word of which caused me trouble (cunctation, from Thomas Browne more than 350 years ago). His star witnesses prove that brilliance can dazzle without blinding.
And so does he. Let’s end on a positive note, with another passage of Hart’s that I loved:
Language is magic. It is invocation and conjuration. With words, we summon the seas and the forests, the stars and distant galaxies, the past and the future and the fabulous, the real and the unreal, the possible and the impossible. With words, we create worlds—in imagination, in the realm of ideas, in the arena of history. With words, we disclose things otherwise hidden, including even our inward selves. And so on. When you write, attempt to weave a spell. If this is not your intention, do not write.
A recent study, as you may or may not have heard, has found that teaching grammar to Year 2 children (age 6–7) does not improve their writing. But that’s not what it found.
The study, by researchers at UCL and the University of York, did not compare grammar teaching with no grammar teaching. It compared one particular programme of grammar teaching, called Englicious, with the grammar teaching that schools were doing already, and it found that Englicious produced results that were essentially no better than the other grammar teaching.
The research paper makes this very clear, although the conclusions section seems to stray a little into over-generalisation. And the news articles published by the two universities both lead with the generalised claim:
The teaching of grammar in primary schools in England (a key feature of England’s national curriculum) does not appear to help children’s narrative writing, although it may help them generate sentences, according to new UCL-led research.
Lessons on grammar are a key feature of the national curriculum taught in England’s primary schools, but they don’t appear to help children to learn to write, new research reveals.
This is the angle the media coverage has largely followed. (Note to journalists: always read the PDF.)
The background
Grammar teaching, which has become a bigger part of England’s national curriculum since 2014, is contentious. Some see it as providing the essential building blocks of literacy and good communication, while others think it a bewildering morass of jargon that has little relevance to how people really speak and write.
I broadly support grammar teaching in principle, although I don’t have strong views on how much to teach at particular ages and I can’t claim any expertise on teaching methods. But I have liked the idea of Englicious since I first heard about it.
Englicious is a set of resources to help teachers run classes on grammar in line with the curriculum. It aims to address two of the big worries about grammar teaching: that it’s dull and off-putting, and that the theoretical knowledge of grammar is disconnected from children’s actual writing skills. Its approach is strong on interactive exercises, trying to incorporate some fun, and it links grammatical concepts with practical writing work. This sounds to me like a great idea.
The study
To test it, the researchers recruited primary-school teachers and put them into two groups: one group used Englicious with their Year 2 classes, after an introduction to the resources and training in how to use them; and the control group taught grammar using the various approaches they were already using (this is the crucial detail that most of the coverage has obscured).
The research paper highlights one distinctive feature of the Englicious approach:
One key difference between intervention and control schools… was that the Englicious lessons consistently included an opportunity for pupils to apply their learning through an independent writing activity that was part of the Englicious lesson. It appeared that this was not a typical approach in every lesson observed in the control schools. In the control schools a wide range of teaching strategies was seen being used to support learning about grammar, for example general approaches to grammar teaching that included using a text to contextualise teaching of grammatical terms and their properties; teacher-led strategies including deliberate inclusion of errors when presenting texts; whole-class activities including discussions while pupils were sitting on the carpet; and use of mini whiteboards for pupils to write sentences…
Because Englicious was designed to link grammar to writing, the main way the researchers assessed its effect was through a narrative writing test, in which pupils had to create a narrative based on a prompt. They also used a sentence generation test, in which pupils were given two words as a prompt and had to generate sentences using them both – a task more focused on grammatical understanding.
The findings
Looking at the test scores from before and after a ten-week period of teaching, the study found that Englicious had no effect on the pupils’ narrative writing scores relative to the control group, and that it may have improved sentence generation scores a little – but this difference was not statistically significant (p=0.25).
The researchers gamely describe the second finding as “encouraging”, although I think “disappointing” would be a fairer assessment. Englicious may have been slightly helpful with actual grammar teaching (more research is needed), but it failed in its primary objective.
This doesn’t mean the approach is worthless: it seems to be at least as good as other current methods; it may have other benefits that weren’t covered by the two tests; and it might help pupils’ learning to last for longer than this ten-week study. And the teachers who used Englicious gave largely positive feedback on it in questionnaires, saying that the lessons were a positive experience for both them and their pupils. That counts for something. They also made some suggestions that could improve Englicious.
The confusion
But there’s no escaping the fact that the study didn’t find the main desired effect. And here, perhaps, is where the widespread misunderstanding of the findings was born. The paper says:
The lack of effect on narrative writing is the main outcome of our research, and is consistent with previously published studies on grammar and writing at primary education level.
These older studies, though, didn’t look at Englicious or the post-2014 English curriculum. The general thrust of the previous research, as this paper summarises it, is that grammar teaching, on the whole, shows little if any sign of improving pupils’ writing. To further support that conclusion, you would have to compare grammar teaching (of some kind) with no grammar teaching, not compare different kinds of grammar teaching.
One sentence on the paper’s methodology acknowledges this problem:
The context of England’s national curriculum requirements meant that it was not feasible to have a control group that did not have any grammar teaching, a control that some would regard as a better comparison.
That last bit seems quite an understatement.
In light of this problem, I struggle to see how the following conclusion is a fair reflection of a study that compared different methods of teaching grammar:
The research found that seven-year-old pupils’ narrative writing was not improved as a result of grammar teaching.
I suppose the researchers could say “We gave it a really good shot, but all we’ve got is another way that doesn’t work; this suggests that the whole idea of connecting writing skills to grammar teaching is doomed to failure.” But it’s clear that they think the Englicious approach has room for improvement.
The upshot
I wish the Englicious team the best in further developing their work. Their approach has merit – it’s just not yet clear how much. And I agree that there are broad grounds for concern about how grammar teaching can improve writing skills.
This was a valuable and well-conducted study – the first randomised controlled trial in the world to examine this topic. But I think it has been over-interpreted: in parts of the conclusion and the university press releases, and in media reports that don’t look too closely at the detail.
Grammar teaching is politically fraught. For many people, it taps into notions of authority, discipline and tradition versus liberalism, diversity and modernity. Debates on it can get pretty heated, with people often falling back on ideological preconceptions. So if we want to improve how grammar is taught, we all need to be clear about what the relevant evidence does and doesn’t show.
Dafne Keen as Lyra with Pantalaimon, from the BBC adaptation of Philip Pullman’s His Dark Materials.
Susan McDonald, an experienced subeditor at the Guardian, has written an article that appears to be about grammar and usage but is really about everyone’s favourite topic: how annoying other people are.
McDonald doubtless knows more than a thing or two about whipping ungainly sentences into shape. Her daily work involves tweaking punctuation, replacing clichés, shepherding stray verbs towards their subjects, and making all sorts of other small changes that smooth the path from the writer’s brain to the reader’s.
But she says she doesn’t nitpick for the sake of it, instead using common sense to decide when rules can be bent. I agree with that as a broad principle, but the thing about common sense is that it’s never as common as you think. What strikes one person as sensible flexibility will strike another as sloppy inconsistency; one person’s high standards will be another’s restrictive dogmatism.
McDonald gives some examples of things that definitely do matter (to her):
Some of my personal bugbears come up more regularly than others.
“Compared to” is different from “compared with”. Use the first if you want to liken one thing to another, the second to contrast.
And that reminds me: in my book “different from” is correct, “different to” and “different than” are not.
“Who” and “whom” are both needed but not interchangeable. The same goes for less/fewer, like/such as and imply/infer.
As a copyeditor, I think I would be absolutist about only one of these six. For moderately formal pieces, I’d probably apply three or four more of them across the board and the other one or two depending on context.
But I would also usually avoid using ‘regularly’ to mean ‘frequently’, as McDonald does here – so on that point I’m more of a stickler than her. And there are people who would scorn both of us for beginning sentences with ‘and’.
There’s no objective way of telling which ‘rules’ are the ‘correct’ ones. Any of us can talk about what’s right ‘in my book’ – but each of our mental rulebooks is different.
Some people respond to these differences by saying that the safest thing to do is always follow all the rules – that way, you won’t annoy anyone. But a lot of these (alleged) rules are, to put it politely, stupid. Picking every conceivable potential nit would be enormously time-consuming and make prose awkward, stiff – even annoying. McDonald rightly points out that, for instance, split infinitives and sentence-ending prepositions often produce better results.
A lot of these decisions are judgement calls. You have to think about audience, medium and desired effect. You have to keep abreast of how people are using the language and how they expect it to be used. You have to estimate which fine distinctions are too fine to be reliably clear, and you have to have a nose for what might be likely to cause a stink.
You also have to remember that the people who complain about ‘broken rules’ are far louder than those whose reading is eased by a certain breakage – but that doesn’t mean the loud ones are anything like a majority.
Sometimes there’s no right answer. Language isn’t like mathematics; it’s like life.
McDonald describes her linguistic gripes as bugbears, and many people talk semi-fondly of having pet peeves, but really these attitudes are more like Philip Pullman’s dæmons – they’re aspects of ourselves. They are changeable during childhood but become fixed as we grow up. They might cause us annoyance, but they are a dear, cherished part of who we are, and any attempt to separate them from us causes terrible pain.
The last line of McDonald’s piece is:
Language reflects – and can even define – who we are. So a little respect, please, for its rules.
It’s not just language but also our attitudes to language that are part of our identities. But the ‘rules’ we get the most righteously angry about don’t belong to language in itself. They belong to our personal conception of it. And when we meet someone whose internal rules are frustratingly different, we have two options: banish their dæmon or pacify our own.
Neither is easy.
Clare Foges has written a passionate column in the Times about the dangers posed by linguistic prejudice.
She highlights evidence of discrimination against people, especially young people, whose speech doesn’t fit in. Civil servants who lack the “right accent” are less likely to get promoted, even if they do good work; in industries from law to accountancy, those who don’t sound middle-class enough are less likely to be hired, even if they have good grades; and many professionals give lower ratings to answers delivered in certain varieties of English, even when the content of those answers is good.
As Foges says, people who speak dialects other than standard English “are surrounded by invisible barriers to success, yet we as a nation are too squeamish to say anything about it”. I commend her for raising this injustice, and for her directness in saying that “We do young people no favours by pretending that the way we speak doesn’t matter any more, because it does.” Collectively, we do need to get better at talking about our language and the role it plays in society.
The current situation isn’t just bad for those who are looked down on because of the way they talk. It’s bad for all of us. It means that the legal system is needlessly denying itself valuable talent. It means that the accountancy firms that businesses rely on are carelessly tossing out some of the best recruits available. And it means that we, the taxpayers, are getting poor value for money when mediocrities get ahead in the civil service while more capable public servants are overlooked.
How, then, can we combat this economically and socially ruinous linguistic prejudice? Foges has a simple solution: elocution lessons.
Wait, what?
Yes, she wants to give young people “speech coaching”, to endow them with the diction of the middle classes and the grammar of standard English. The best way to deal with prejudice, it seems, is to eliminate its object.
She briefly considers an alternative proposal – that linguistically prejudiced people should “challenge their biases” – but she isn’t interested. Because, you see, she agrees with them that standard English is better than other dialects. Here’s her argument:
Standard English is best because (the clue is in the name) it is the standard, with rules the vast majority understand. It is the medium through which writers and speakers of the language can achieve maximum clarity and minimum confusion. This is why deviating from it can grate. If people speak sloppily, mangling their grammar and failing to enunciate their words properly, language turns from a window between souls into a wall between them — and swiftly, subconsciously, we label the speaker.
In the spirit of diplomacy, I will try to meet Foges partway on this.
I agree it is important for children to learn standard English – but not because it’s better than other dialects or more precise or more expressive. It’s useful to know because it’s widely used in the public sphere, in business, in academia… in the kind of professions traditionally dominated by people who grew up in well-to-do families that speak standard English.
But standard English didn’t achieve its high social status in some merit-based competition, beating other dialects because of its intrinsically superior grammatical conventions and vocabulary. It got where it is because of William the Conqueror.
Before 1066, the capital of England was Winchester, which had previously been part of the Kingdom of Wessex. Back then, regional differences in English were much bigger than they are today, but the West Saxon variety – spoken in Winchester, the seat of power – looked set to become top dog. After the Norman Conquest, though, the capital was moved to London. The locally spoken Mercian became the high-status dialect, and out of it grew what we now call standard English.
While I bear our friends and neighbours in France no ill will for that ancient act of aggression, I cannot fathom why we in England should let a long-dead Frenchman determine which variety of our own language is best.
Standard English is only standard because of an accident of history. And different dialects are not “deviations” from it, they’re not “sloppy” or “mangled” attempts to speak it: they’re just different. What they lack is not clarity but prestige.
Yes, kids should learn standard English, but they should also learn that English is a family of dialects, related to region, class, and more recently ethnicity. All of these dialects change, including standard English; they influence each other and their borders overlap. (Foges complains about the adoption of Multicultural London English by “teenagers in country towns who desperately want to appear cool”, but changing the way you talk to make yourself come across a certain way is exactly the policy she recommends.)
But if we want to reduce the ill-effects of linguistic prejudice, there’s another side of the coin. Those of us who are perfectly at home with standard English should be careful how much importance we attach to whether others speak like us, especially if we’re in positions where we can help others to advance.
I am an editor. It is literally my job to improve other people’s writing – pretty much always in standard English. I fix typos, I substitute words, I tweak grammar, I rejig paragraphs, trying to help my colleagues come across as clearly and effectively as possible. What I absolutely do not do is judge those of them who are less comfortable than me with the niceties of standard English.
I’ve worked with a lot of people who are good at all sorts of things – and some of them not so good – but I’ve never noticed that their skills and commitment have any correlation with their dialect, accent and enunciation.
So this attitude described by Foges is a lamentable mistake:
When someone says “could of” instead of “could have”, or “pacifically” instead of “specifically”, or “froo” instead of “through”, they are labelled. The interviewer labels them “not sharp enough”. The colleague labels them “not up to it”.
It’s not quite clear whether she shares this attitude herself, but it’s damn clear that she’s not going to do anything to oppose it.
Foges ends by quoting Henry Higgins talking to Eliza Doolittle:
Think what you’re dealing with. The majesty and grandeur of the English language, the greatest possession we have. The noblest thoughts that ever flowed through the hearts of men are contained in its extraordinary, imaginative, and musical mixtures of sounds.
But Eliza’s English is no less English than Henry’s. If all our speech were standardised, those mixtures of sounds would be so much less extraordinary, imaginative, and musical.
Every so often a kerfuffle erupts about the teaching of grammar in English schools, and the focal point these days is often the ugly term “fronted adverbials”. What on earth are these obscure things, people wonder, and why are young children being forced to learn about them?
Two daunting words, one simple concept
Let’s start with “adverbial”. It looks like an adjective, doesn’t it? “Of or relating to adverbs”, that sort of thing. But, unhelpfully, it’s actually a noun. An adverbial does the same job as an adverb (modifying a verb or a clause, typically to express manner or time), but it can consist of more than one word.
And “fronted” means it’s at the start of a sentence or clause. That’s not too tricky to grasp, but we hardly use “fronted” to mean that in other contexts, so the phrase feels kind of strange.
Here are some fronted adverbials:
Cheerfully, I bit into the apple.
Yesterday evening, they went out.
Further along the road, a cat was sitting on top of a car.
When the going gets tough, the tough get going.
It’s not nearly as hard as it sounds. Whichever linguist came up with the term needs to be docked a week’s pay, but a fronted adverbial is a pretty basic grammatical device that we all use every day.
(The term dates back to the 1960s, but it’s only recently escaped from academia to bother the public at large.)
Well, we never needed to be taught that in my day
Many of us are puzzled about why this new term is being taught in primary schools when we adults have managed perfectly well without it.
I suspect that kids may not find this kind of novel jargon nearly as discombobulating as we adults do. When you’re at school, you expect to learn new things, to be taught new words for new ideas, all the time. “Fronted adverbials”, for all its unintuitive awkwardness, is just another to add to the list. But as adults, we like to think we already know what’s what.
It can be uncomfortable to find your children learning things that you don’t know, but that doesn’t necessarily mean that the new teaching is needless or too advanced, or that you are stupid. It just means that teaching has changed – in this case, because it’s become informed by a better understanding of English grammar.
(One of my nieces, aged five, impressed me by introducing me to the word “digraph”. It means a pair of letters that combine to make a single sound. So in the word “shoot”, “sh” and “oo” are digraphs. Simple. But the term itself is baffling if you don’t already know it.)
I’m neither a parent nor a teacher, and I don’t have a view about how old children should be when they meet adverbials and their fronting. I would say, though, that this concept should be taught not just as a term to memorise, with bland exercises that require kids to identify or create example sentences on demand. Fronted adverbials are a way of shifting emphasis, of re-ordering information, of changing the rhythm of a sentence. Children should be guided to try out using them, or not, in passages of writing to see the different effects that result. That’s how a knowledge of grammar really helps to improve your writing ability.
And yes, you can achieve that without knowing that there’s this specific term, but having a term can help you identify and think about the concept more tangibly than just “that bit at the end could go at the front”.
The theory can help with the practice. You can walk and run and jump without knowing anything about the muscles and bones in your legs, but if you want to train to get really good, or to deal with the risk of injuries, it’ll help to know a bit about what’s going on inside.
Fronted adverbials are only part of it, though. There’s a lot more grammar in the curriculum than there was in my day, and the grammar taught nowadays is quite different from the grammar that was taught in my parents’ day.
Let’s have a bit of a history lesson.
The decline and fall of the Roman linguistic empire
Social climbers, in their yearning for prestige, often adopt the language of the upper classes – often unconvincingly. The same thing happened to English grammar itself.
A rough, common tongue that grew up on a rainy island on the outskirts of Europe eventually came of age, and it realised that despite its growing literary prowess it still lacked respectability. So it modelled itself on the noblest and most distinguished language of all: Latin.
Sadly, some of the concepts and categories and distinctions used in Latin grammar didn’t fit English very well, but enough of the literati and educational establishment still insisted on adopting them. For this act of vanity, generations of schoolchildren have suffered.
Roman rule in Britain ended before the (Old) English language had even taken hold. But, a millennium later, Rome’s dead language colonised our understanding of our own. It ruled, ineptly but firmly, for a few centuries. Then, after a string of revolts, Latinised English grammar was finally overthrown in the 1960s, and – after a few decades of anarchy – a more authentic understanding of English grammar is now being applied in classrooms.
The new regime has had no shortage of teething problems, and sometimes its proponents implement their concepts with heavy-handed revolutionary zeal. It also has to contend with nostalgics, counter-revolutionaries, and people who just don’t want to have to think about this stuff.
The aversion to “fronted adverbials” is part of this. It’s a sound concept but it comes across badly. While the theory of grammar is in a much better place than it used to be, it will take a while to figure out how best to present the ideas in schools.
But what’s the point of talking about fronted adverbials?
This morning Michael Rosen, who abominates the current state of grammar teaching, wrote:
Just last week I was privileged to record a radio programme about writing with one of the great modern writers, Hilary Mantel. We talked about the sound and rhythm of sentences, the struggle to find the right word, the shaping of a paragraph so that it sets a scene before introducing a character, and much more. We talked for nearly an hour and we did not mention a fronted adverbial once.
The shaping of a paragraph so that it sets a scene before introducing a character. That’s exactly what Rosen does here, and he uses a fronted adverbial – “just last week” – to do it.
They’re worth knowing about. I just wish they had a better name.
The remarkable thing about language change is that it only started happening when I started noticing it. For centuries, English was constant and true, but as soon as I was old enough to have an appreciation of good standards of usage, people around me started falling short. Since then, there has been an alarming, unprecedented surge in rule-breaking.
Neither I nor anyone else really believes any such thing, of course, but some of us sometimes talk as if we do. One such person is Lionel Shriver.
In an article in Harper’s, she wages war on what she calls “semantic drift”. Using the rhetorical style that’s obligatory for such pieces – mock-theatrical (and therefore deniable) moral horror – she rails against “decay”, “degeneration”, “blight”, “barbarism”, “mob rule” and the replacement of “civilised” with “contaminated” English at the hands of “animals”. Shriver’s a fantastic writer, but this kind of thing is just tiring.
The substance of this linguistic apocalypse is, as she sees it, the ignorant modern misuse of words such as literally, nonplussed, notorious, performative and enervated, and the blurring of distinctions such as less/fewer, as/like, who/whom and that/which.
On some of these, I think she has a point. While it’s unlikely anyone will be genuinely confused by “My head literally exploded”, the near-opposite meanings that nonplussed now has make it hard to use reliably. And it’s handy, even if only for formal occasions, to know how to whom. The that/which distinction, on the other hand, is needless. Most Brits (and a good many Americans) are indifferent to it, with no ill effects.
But Shriver’s examples of “semantic drift” also include grammar (flat adverbs and pronoun case) and punctuation (comma splices and indiscriminate dashes), so I guess the word semantic is drifting quite a bit too. She also makes it drift to include pronunciation, claiming that “‘flaccid’ is actually pronounced ‘flak-sid’”. In light of usage, which she accepts is almost entirely ‘flassid’, the meaning of actually must have drifted as well.
OK, that was cheap snark. But it gets us to the heart of the matter: what determines the actual rules of English?
There’s a view that the rules are wholly independent of the usage of English speakers, that the theory is what’s real and true while the practice is at best an approximation and more often a travesty. On this view, usage is evidence of nothing other than failure and corruption.
Nobody really believes any such thing, of course, but some of us sometimes talk as if we do.
The Good Book or the guidebook?
Shriver was raised as a language stickler, and the pedantry she inherited from her parents she reinforced at school. But for her, the ultimate authority, the guide to “official” English, is The Dictionary. She knows that she is fallible – her parents too – and is willing to take corrections when appropriately justified:
Hence when the copy editor on my first novel claimed that there was no such word as “jerry-rig,” I was incensed. Determined to prove her wrong, I went to my trusty, dusty-blue Webster’s Seventh (based on the august Webster’s Third), only to find she was right: “jerry-rig” wasn’t listed. Apparently I’d grown up with a garbled portmanteau of “gerrymander,” “jerry-build,” and the word I really wanted: “jury-rig.” The scales fell from my eyes.
A convert, I explained to my mother her lifelong mistake, but she was having none of it. “Oh, no,” she said gravely. “‘Jury-rig’ refers to rigging a jury, which is very serious.” Explaining the allusion to a “jury mast,” a makeshift sail, with no etymological relationship to a judicial “jury,” got me nowhere. It’s fascinating how ferociously people will cling to their abiding linguistic assumptions, however wrongheaded.
But there’s a twist: nowadays, dictionaries list the “incorrect” spelling as standard. “The mob – and my mother – have won.” Shriver, though, isn’t going to budge. Even though recent dictionaries now align with the way most people spell it – and the way Shriver herself long did – she has found her truth and she’s sticking to it, with the zeal of a convert whose prophet has snuck off to the pub.
For Shriver, a dictionary should be a rulebook of almost scriptural immutability. She wants usage to adhere to the rules that she spent time and effort internalising; any deviation, whether by the ignorant masses, by trendy literati or by dictionaries themselves, is to be fought.
The better way to view a dictionary is as a guidebook. It describes the features of the language as you’re likely to encounter it, and it thereby helps you find your way around. To do this, a dictionary needs to record differences in usage and it needs to be able to change.
Don’t just take my word for it, though.
Shriver’s “trusty” Webster’s Seventh New Collegiate Dictionary is a 1963 abridgement based on the “august” full-length Webster’s Third New International Dictionary, which came out in 1961. The Third was not seen as august at the time. In fact, it outraged many contemporary sticklers, who were appalled by its permissive, descriptivist approach. In the preface (the bit that nobody reads), its editor, Philip Gove, wrote that “a definition, to be adequate, must be written only after an analysis of usage”. He concluded:
This new Merriam-Webster unabridged is the record of this language as it is written and spoken. It is offered with confidence that it will supply in full measure that information on the general language which is required for accurate, clear, and comprehensive understanding of the vocabulary of today’s society.
Today’s society. As a new dictionary, it paid no heed to the aggrieved traditions of yesterday’s sticklers. And Gove knew that his work – his guidebook – would have a shelf-life. He knew that some of the language his team mapped would change in years to come. He wouldn’t have wanted the book to be treated as scripture almost six decades later.
But that scripture is what Shriver grew up with. That book formed part of the fundamental order of the world as she was honing her command of English, so it’s understandable that departures from it seem like creeping anarchy, like the destruction of something precious – like a “bereavement”, even.
Each generation thinks it invented language change
Maybe I can offer a scrap of consolation. Despite Shriver’s fears, language change definitely isn’t her fault.
Noting that she is more liberal than her father on some matters, such as the meaning of decimate, she says: “my own generation probably instigated this decline in the first place”.
Not guilty. Decimate slipped the bounds of “reduce by one-tenth” to start meaning “destroy a large part of” as early as 1663.
And some of the recent changes that make up her bugbears are not that recent:
Notorious, Shriver says, doesn’t just mean “well-known”. But the word dates back to the 15th century, when originally it meant exactly that. Over the years it acquired negative connotations, and for a long time it has mostly been used negatively – but only mostly.
She deplores the modern use of quicker as an adverb. But here’s Tennyson in 1865: “Nature… on thy heart a finger lays, Saying ‘Beat quicker’.” Adverbial quick has been in constant use since 1300 – informal, but hardly disreputable.
Performative is a term in linguistics, relating to utterances that enact what they state: “I promise”, “I warn you”, “I apologise”. Nowadays most people use it to mean “relating to performance”, but the correct word for that, she says, is performatory. In fact both words have a patchy history. JL Austin coined the technical sense of performative in 1955, but for several years before that he had been using performatory that way. For the performance-related meaning, performative goes back half a century earlier and is the norm today. Despite Shriver’s pessimism, the word’s linguistic meaning is alive and well too – among linguists. Many words comfortably carry more than one meaning, depending on context. We don’t need performatory and we shouldn’t mourn it.
Change didn’t begin with the baby boomers. It’s always been happening (and people have always been complaining about it). The rules Shriver grew up with were simply the customs of the day – some ancient, some much newer. Most are still in place, but the changes stand out. And even when the changes aren’t changes, the realisation that many or most people don’t follow your preferred conventions can be disconcerting.
Our language is part of our culture, our identity. We like things to be done our way, and we like to think that our way has some objective, enduring superiority. So yes, it’s fascinating how ferociously people will cling to their abiding linguistic assumptions, however wrongheaded.
You know that electric thrill when something jostles a shapeless old sack full of dim notions in one of the dusty, dark rooms at the back of your head and it rolls over and its contents somehow rearrange themselves and suddenly coalesce to form a coherent idea that bursts out of the sack, crackling with energy, glowing with light, beautifully new and familiar at the same time? Lane Greene’s Talk on the Wild Side gave me one of those moments.
It’s a great book, lucid and insightful, a riposte to the grumpy sticklers who complain that our language is going to the dogs, who see every change as a sign of decay. Greene rejects their implicit view of English as “an easily threatened logical system for conveying meaning without loss or confusion, which could crumble if we don’t mind it carefully”. Instead, he says, we should realise that language is “a robust, organic and evolving phenomenon that needs relatively little intervention”.
Our language may be untidy, inefficient and imprecise at times, Greene says, but it is hardy, resourceful and adaptable.
To make his case, he explores the natural, social history of language change, the links between language and national identity, and the enormous difficulties of teaching machines to speak. He marvels at the dreamers who have tried to invent logical new languages and despairs at the clueless certainty of amateur grammarians. He even explains the underrated linguistic skills of Donald Trump.
The bit that really made my brain perk up, though, was a single well-placed turn of phrase.
Greene discusses what happens when children first bump into unexpected “rules” of English, rules that clash with the language they’ve picked up from the adults around them. Lots of us can remember a teacher telling us not to say “Can I?” when asking for permission, or not to say “Me and Billy went”…
When children are suddenly told that what they know their parents and virtually everyone else says, and what they have been saying all their lives thus far, is “wrong”, there’s an early disconnect between the child’s native competence and the new idea of an invisible but Platonically correct language out there, one that nobody seems to be using.
As Greene says, the typical reaction to this is puzzlement and even humiliation at being wrong when you’re sure you were right. If you continue to be bruised by encounters with “grammar”, you’ll grow to resent it.
But here’s what this made me think: Not all children react in the same way to those early encounters with the Platonic realm of proper English.
There’s a scene from a David Tennant episode of Doctor Who where the Doctor talks about how the Master first became power-crazed:
When he was a child. That’s when the Master saw eternity. As a novice, he was taken for initiation. He stood in front of the Untempered Schism. It’s a gap in the fabric of reality through which could be seen the whole of the Vortex. You stand there, eight years old, staring at the raw power of time and space, just a child. Some would be inspired, some would run away, and some would go mad.
When kids glimpse Platonic English, some of them realise that they’re supposed to learn how to use it, and they make partly successful efforts to do so. Some are distrustful, thinking it a threat to their own language, and they try to avoid it. And some stare at this elusive, mysterious knowledge – a purer, higher truth that sets the elite few apart from the masses – and they go mad.
Is this the moment when apparently ordinary children turn into budding grammar-peevers? Is this the catalyst that makes a love of knowledge fuse with a respect for authority to produce a desire for linguistic superiority?
When I remember my own early experiences of being “corrected” by teachers, I think I managed to avoid going mad. Just.
I was one of those annoying kids who took pride in being clever, and I had a pedantic streak that nowadays is mostly under control. More charitably: I liked understanding things. And the rules of English were something I could take an interest in. I was at risk of proto-peeverism, and as young as seven or eight my speech and writing were already more formal than the average kid’s (even the average smart-alec kid’s), but I managed to keep my feet on the ground. During my teens I grew out of the idea that formal language was a sign of sophistication and intelligence and maturity.
But some people don’t grow out of it. They grow ever more fixated on the Platonic world, anxious to protect its perfection from the barbarian hordes of real life and zealous to conquer the rest of us in its name.
If they were to read Talk on the Wild Side, they might learn to ease up. Maybe just a little.
The other day I saw this sign at a railway station:
Sign at Euston station: “When you alight the escalator keep moving forward”
Sorry the photo’s a bit blurry, but funnily enough I was on a moving escalator at the time.
When I saw it I thought: “‘Alight the escalator’? What in the name of Samuel Johnson’s gout-ridden ghost is this sorry abomination? Shouldn’t it be ‘alight from the escalator’?”
I hadn’t ever seen “alight” as a transitive, non-phrasal verb before. So, after apologising to the people I knocked over when I stopped to take the picture, I ran to the dictionaries – Oxford, Collins, Chambers, Macmillan, American Heritage, Merriam-Webster – but none of them had this transitive usage.
The norm is “alight from”, although the OED notes some historical uses of “alight out of”, “alight down from” and “alight off”.
So, is “alight the escalator” a quirk of one sign writer or an up-and-coming innovation? Or maybe even a nonstandard usage that has been rumbling along unnoticed for a long time?
It makes me think of “depart”, which is another official-sounding word strongly associated with transport signs and announcements, and which is increasingly being used transitively without a “from” (compare “depart from the station” and “depart the station”). But from what I can tell, transitive “alight” is a lot rarer than transitive “depart”.
When I started looking into usage of the verb “alight”, I found something surprising: there isn’t much. I already knew it was one of those quaintly formal bits of transport-speak, even more so than “depart”, but I hadn’t realised just how rare it is.
In the Corpus of Contemporary American English (a record of 520 million words used during 1990–2015), I found just 34 instances of “alighting/alighted from the” – and only one transitive use, without the “from”.
In the GloWbE corpus of online usage, 774 million words from US and UK websites contained just 16 uses of “alighted/alighting from the” and five of “alighted/alighting the”.
These are tiny numbers.
A search of Google Books produced more examples of both. Uses of “alight/alighting/alighted from the train” massively outnumber those of “alight/alighting/alighted the train”. But still, the vast majority of the latter were results like “after you alight, the train will pull away” and “I should not have alighted. The train was gone…”
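If you fancy repeating this kind of rummaging on a plain-text corpus of your own, here’s a minimal sketch of the counting involved. To be clear, this isn’t how COCA, GloWbE or Google Books are actually queried – they have their own search interfaces – and the patterns, function name and sample sentences below are invented purely for illustration.

```python
import re

# Hypothetical patterns for the two constructions discussed above:
# "alight(ed/ing) from the ..." versus transitive "alight(ed/ing) the ...".
FROM_PATTERN = re.compile(r"\balight(?:ed|ing)?\s+from\s+the\b", re.IGNORECASE)
TRANSITIVE_PATTERN = re.compile(r"\balight(?:ed|ing)?\s+the\b", re.IGNORECASE)

def count_alightings(text: str) -> dict:
    """Tally 'alight from the' versus transitive 'alight the' in a chunk of text."""
    return {
        "alight from the": len(FROM_PATTERN.findall(text)),
        "alight the": len(TRANSITIVE_PATTERN.findall(text)),
    }

if __name__ == "__main__":
    # Invented sample sentences, echoing the examples in the post.
    sample = (
        "Please alight from the train here. "
        "He alighted the escalator and kept moving forward. "
        "After you alight, the train will pull away."
    )
    print(count_alightings(sample))
    # -> {'alight from the': 1, 'alight the': 1}
```

On those three made-up sentences it counts one of each construction – and the comma in “after you alight, the train will pull away” keeps that sentence out of the transitive tally, which is exactly the kind of false positive a naive phrase search can’t filter out.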
Most of the definite examples of transitive “alight the train” that I could find were post-2000. The earliest was in a US Congressional committee report from 1975.
So it seems that transitive “alight” is flitting around the edges of usage. If it starts to catch on, it will find itself pursued by a mob of angry pedants with plasticine pitchforks.
Like it or not (and I don’t), I think it could spread fast. When a word’s use is dominated by a small group – such as railway staff – there’s more scope for idiosyncrasy in that group to change that use. If signs like the one I saw appear across the country, and if announcements follow suit, that will set the tone for the rest of us.
But who knows? Anyway, there’s a bit more to this story: the beginning.
The earliest “alight from” in the OED’s records is from 1477, in William Caxton’s translation of Raoul Le Fèvre: “Peleus and Iason were alighted from their hors.” But “alight of” and other variations go back all the way to Old English, the first in the OED being from Ælfric of Eynsham in the late 900s: “Ic geseah þurh Godes gast, þa se þegen alihte of his cræte” (I saw through the Spirit of God that the officer alighted of his chariot).
And here’s the funny thing. Back then, there was a transitive sense of “alight”. It’s been obsolete since the 1600s, but it meant “to make light, or less heavy; to lighten, alleviate (a burden); to relieve (a person) of a burden”.
Here’s an early example, also from Ælfric: “Þa ealdan cyningas on ðam ærran timan hogodon hu hi mihton heora byrðena alihtan forþan ðe an man ne mæg æghwar beon” (the old kings in previous times thought about how they might alight their burden, for one man cannot be everywhere).
And one from Reginald Pecock in 1449: “for this cause of aliȝting the poor men it is alloweable and profitable, that lordis and ladies haue mansiouns with inne the cloocis gatis”.
These two are figurative uses, but it could be literal as well: if you got off your horse, you would make it lighter; you would relieve it of its burden; you would alight it.
And if there had been an escalator around for you to get off, you would have alighted that too.
The only thing as bizarre and horrifying as the Trump administration’s loudening belches of vicious, incompetent corruption is the coverage thereof in the New Yorker. Specifically, the punctuation. Exhibit A: the recent headline “Donald Trump, Jr.,’s Love for Russian Dirt”. When readers objected to that eye-watering “Jr.,’s”, the magazine’s Andrew Boynton explained the thinking:
The reasoning for the punctuation of “Jr.,’s” is pretty straightforward. It’s a collision of conventions. The first convention is The New Yorker’s—we place a comma before “Jr.” Doing so leads to another of our conventions: when something like “Jr.” occurs in the middle of a phrase, clause, or sentence, it is set off by its preceding comma and a following comma. Thus: “Ed Begley, Jr., was in ‘St. Elsewhere.’” A third convention is one that we all accept: the possessive is indicated by the addition of an apostrophe and “s.” We (the magazine) like our punctuation; we set things off with commas a lot; it drives some people nuts (i.e., it’s “bullshit”). This reaction is not surprising; it is also not new. With “Jr.” occurring in the middle of a line, where else is the possessive indicator supposed to go?
I’m happy to tell them where to stick it.
It’s silly
Boynton’s case sounds very logical, but it’s also very silly. The New Yorker is a magnificent publication, and one of the most carefully copyedited in the world, but some of its style conventions are odd – and applied too zealously.
A house style on punctuation (or any other aspect of language) exists to help the reader get at what the writer is saying. Consistency is a big part of that: if an article’s comma style varies from one paragraph to another, it can be disorienting and distracting.
But English is a magical beast, and the ways we can ride it are myriad. This means that too much consistency can get in the way. Sometimes a mostly reasonable style rule can cause trouble, and the pile-up that is “Jr.,’s” is one of those times. In these cases, a wise copyeditor will break the rule – or find a way to avoid it.
Also, it’s wrong
Oh yes, and neither comma should be there anyway, regardless of the apostrophe. A New Yorker convention this may be, but the pair of them are confusing and misleading. To explain, I’ll rely on the magazine’s wonderful “Comma Queen”, Mary Norris.
She has repeatedly defended the first and second commas in this notorious New Yorker sentence: “Before Atwater died, of brain cancer, in 1991, he expressed regret.” Critics think these two commas needlessly fussy, but Norris says:
I took a good, hard look at the magazine’s policy, and I persuaded myself that in fact these commas were not indiscriminate. They marked off segments of the sentence that were not germane to the meaning. The point of the sentence… is that Atwater expressed regret before he died. What he died of and when he died of it are both extra details that the author, Jane Mayer, provides only to satisfy the reader’s curiosity. They aren’t essential to the meaning of the sentence. They are nonrestrictive.
I don’t wholly agree with this policy, but Norris’s reasoning is clear. The two phrases bracketed by commas – “of brain cancer” and “in 1991” – are inessential asides. You could cut them and the main statement would be intact: “Before Atwater died, he expressed regret.”
With that in mind, let’s look at the Trump headline again:
Donald Trump, Jr.,’s Love for Russian Dirt
Here, “Jr.” is bracketed by the pair of commas. It looks like one of those inessential, nonrestrictive asides. So let’s try cutting it to see if it leaves the main part intact:
Donald Trump’s Love for Russian Dirt
Nope. When there are two Donald Trumps in a story, “Jr.” and “Sr.” are absolutely essential. But the commas suggest otherwise.
Perhaps there is a line in the New Yorker style guide saying, “Oh, but in this sort of case, the commas are working differently.” But, if so, New Yorker readers don’t know that.
One of the lessons it took me a while to learn as a copyeditor is that you must work to satisfy readers who neither know nor care what your style guide says. If instead you work – however accurately – to correspond to a geekily intricate but internally consistent set of arcane rules, you are failing. You are working to satisfy yourself and your peers.
Those commas, like the Donald Trumps, are a repellent pair that I wish would appear in the news less often.