Crimes of grammar and other writing misdemeanours



Roslyn Petelin, The University of Queensland

Writing an article like this is just asking for trouble. Already, I can hear one reader asking “Why do you need just?” Another suggesting that like should be replaced by such as. And yet another saying “fancy using a cliché like asking for trouble!”

Another will mutter: “Where’s your evidence?”

My evidence lies in the vehement protestations that I face when going through solutions to an editing test or grammar quiz with on-campus students in my writing courses at The University of Queensland (and no, that’s not deferential capitalisation; the capital ‘T’ is part of the university’s name).

Confirming evidence lies in the querulous discussion-board posts from dozens of students when they see the answers to quizzes on the English Grammar and Style massive open online course that I designed.


Further evidence lies in the fervour with which people comment on articles such as the one that you are currently reading. For instance, a 2013 article, “10 grammar rules you can forget: how to stop worrying and write proper”, by the style editor of The Guardian, David Marsh, prompted 956 comments. Marsh loves breaking “real” rules. The title of his recent book is For Who the Bell Tolls. I’d prefer properly to proper and whom to who, but not everybody else would.

Marsh’s 10 forgettable rules are ones that my favourite grammarian, Professor Geoffrey Pullum, co-author of The Cambridge Grammar of the English Language, calls zombie rules: “though dead, they shamble mindlessly on”. A list of zombie rules invariably includes never beginning a sentence with “and”, “but”, or “because”, as well as the strictures that are a hangover from Latin: never split an infinitive and never end a sentence with a preposition. It (should it be they?) couldn’t be done in Latin, but it (they?) can be done in English. Just covering my bases here.

So, what’s my stance on adhering to Standard English? I’m certainly not a grammar Nazi, nor even a grammando, a portmanteau term that first appeared in The New York Times in 2012 and is hardly any softer. Am I a vigilante, a pedant, a per(s)nickety person? Am I a snoot? Snoot is the acronym that the late David Foster Wallace and his mother — both English teachers — coined from Sprachgefühl Necessitates Our Ongoing Tendance or, for those with neither German nor a cache of obsolete words in their vocabulary, Syntax Nudniks of Our Time.


Foster Wallace reserves snoot for a “really extreme usage fanatic”, the sort of person whose idea of Sunday fun would have been to find mistakes in the late William Safire’s On Language column in The New York Times Magazine. Safire was a style maven who wrote articles with intriguing opening lines such as this: “A sinister force for solecism exists on Madison Avenue. It is the work of the copywrongers”.

Growing up with a mother who would stage a “pretend” coughing fit when her children made a grammar error clearly contributed to Foster Wallace’s SNOOTitude. His 50-page essay “Authority and American Usage”, published in 2005, constitutes a brilliant, if somewhat eccentric, coverage of English grammar.

I need to be a bit of a snoot because part of my brief as a writing educator is to prepare graduates for their utilitarian need to function as writing workers in a writing-reliant workplace where professional standards are crucial and errors erode credibility. (I see the other part of my brief as fostering a love of language that will provide them with lifelong recreational pleasure.)

How do I teach students to avoid grammar errors, ambiguous syntax, and infelicities and gaucheries in style? In the closing chapter of my new book on effective writing, I list around 80 potential problems in grammar, punctuation, style, and syntax.

My hateful eight

My brief for this article is to highlight eight of these problems. Should I identify ones that peeve me the most or ones that cause most dissonance for readers? What’s the peevishness threshold of readers of The Conversation? Let’s go with mine, for now; they may also be yours. They are in no particular order and they depend on the writing context in which they are set: academic, corporate, creative, or journalistic.

Archaic language: amongst, whilst. Replace them with among and while.

Resistance to the singular “they”. Here’s an unbearably tedious example from a book published in 2016 in London: “The four victims each found a small book like this in his or her home, or among his or her possessions, several weeks before the murder occurred in each case”. Replace his or her with their.

In January this year, The American Dialect Society announced the singular “they” as their Word of the Year for 2015, decades after Australia welcomed and widely adopted it.

Placement of modifiers. Modifiers need to have a clear, direct relationship with the word/s that they modify. The title of Rob Lowe’s autobiography should be Stories I Tell Only My Friends, not Stories I Only Tell My Friends. However, I’ll leave Brian Wilson alone with “God only knows what I’d be without you”, though I know that he meant “Only God knows what I’d be without you”.

And how amusing is this commentary, which appeared in The Times on 18 April 2015? “A longboat full of Vikings, promoting the new British Museum exhibition, was seen sailing past the Palace of Westminster yesterday. Famously uncivilised, destructive and rapacious, with an almost insatiable appetite for rough sex and heavy drinking, the MPs nevertheless looked up for a bit to admire the vessel”.

Incorrect pronouns. The irritating genteelism of “They asked Agatha and myself to dinner” and the grammatically incorrect “They asked Agatha and I to dinner”, when in both instances it should be me.

Ambiguity/obfuscation. “Few Bordeaux give as much pleasure at this price”. How ethical is that on a bottle of red wine of unidentified origin?

The wrong preposition. The rich are very different to you and me. (Change “to” to “from” to make sense.) Not to be mistaken with. (Change “with” to “for”.) No qualms with. (Change “with” to “about”.)


The wrong word. There are dozens of “confusable” words that a spell checker won’t necessarily help with: “Yes, it is likely that working off campus may effect what you are trying to do”. Ironically, this could be correct, but I know that that wasn’t the writer’s intended message. And how about practice/practise, principal/principle, lead/led, and many more?

Worryingly equivocal language. After the Easter strike some time ago, the CEO of Qantas, Alan Joyce, sent out an apologetic letter that included the sentence: “Despite some sensational coverage recently, safety was never an issue … We always respond conservatively to any mechanical or performance issue”. I hoped at the time that that wasn’t what he meant, because I felt far from reassured by the message.

Alert readers will have noticed that I haven’t railed against poorly punctuated sentences. I’ll do that next time. A poorly punctuated sentence cannot be grammatically correct.


Roslyn Petelin, Associate Professor in Writing, The University of Queensland

This article was originally published on The Conversation. Read the original article.


Beware the bad big wolf: why you need to put your adjectives in the right order



Simon Horobin, University of Oxford

Unlikely as it sounds, the topic of adjective use has gone “viral”. The furore centres on the claim, taken from Mark Forsyth’s book The Elements of Eloquence, that adjectives appearing before a noun must appear in the following strict sequence: opinion, size, age, shape, colour, origin, material, purpose, Noun. Even the slightest attempt to disrupt this sequence, according to Forsyth, will result in the speaker sounding like a maniac. To illustrate this point, Forsyth offers the following example: “a lovely little old rectangular green French silver whittling knife”.


But is the “rule” worthy of an internet storm – or is it more of a ripple in a teacup? Well, the example is certainly a rather unlikely sentence, and not simply because whittling knives are not in much demand these days (ignoring the question of whether a knife can be both green and silver). It is unlikely because it is unusual to have a string of attributive adjectives (ones that appear before the noun they describe) like this.

More usually, speakers of English break up the sequence by placing some of the adjectives in predicative position – after the noun. Not all adjectives, however, can be placed in either position. I can refer to “that man who is asleep” but it would sound odd to refer to him as “that asleep man”; we can talk about the “Eastern counties” but not the “counties that are Eastern”. Indeed, our distribution of adjectives both before and after the noun reveals another constraint on adjective use in English – a preference for no more than three before a noun. An “old brown dog” sounds fine, a “little old brown dog” sounds acceptable, but a “mischievous little old brown dog” sounds plain wrong.

Rules, rules, rules

Nevertheless, however many adjectives we choose to employ, they do indeed tend to follow a predictable pattern. While native speakers intuitively follow this rule, most are unaware that they are doing so; we agree that the “red big dog” sounds wrong, but don’t know why. In order to test this intuition linguists have analysed large corpora of electronic data, to see how frequently pairs of adjectives like “big red” are preferred to “red big”. The results confirm our native intuition, although the figures are not as comprehensive as we might expect – the rule accounts for 78% of the data.


But while linguists have been able to confirm that there are strong preferences in the ordering of pairs of adjectives, no such statistics have been produced for longer strings. Consequently, while Forsyth’s rule appears to make sense, it remains an untested, hypothetical, large, sweeping (sorry) claim.

In fact, even if we stick to just two adjectives it is possible to find examples that appear to break the rule. The “big bad wolf” of fairy tale, for instance, shows the size adjective preceding the opinion one; similarly, “big stupid” is more common than “stupid big”. Examples like these are instead witness to the “Pollyanna Principle”, by which speakers prefer to present positive, or indifferent, values before negative ones.

Another problem with Forsyth’s proposed ordering sequence is that it makes no reference to other constraints that influence adjective order, such as those that apply when we use two adjectives from the same category. Little Richard’s song “Long Tall Sally” would have sounded strange if he had called it “Tall Long Sally”, but these are both adjectives of size.


Similarly, we might describe a meal as “nice and spicy” but never “spicy and nice” – reflecting a preference for the placement of general opinions before more specific ones. We also need to bear in mind the tendency for noun phrases to become lexicalised – forming words in their own right. Just as a blackbird is not any kind of bird that is black, a little black dress does not refer to any small black dress but one that is suitable for particular kinds of social engagement.

Since speakers view a “little black dress” as a single entity, its order is fixed; as a result, modifying adjectives must precede little – a “polyester little black dress”. This means that an adjective specifying its material appears before those referring to size and colour, once again contravening Forsyth’s rule.

Making sense of language

Of course, the rule is a fair reflection of much general usage – although the reasons behind this complex set of constraints in adjective order remain disputed. Some linguists have suggested that it reflects the “nouniness” of an adjective; since colour adjectives are commonly used as nouns – “red is my favourite colour” – they appear close to that slot.

Another conditioning factor may be the degree to which an adjective reflects a subjective opinion rather than an objective description – therefore, subjective adjectives that are harder to quantify (boring, massive, middle-aged) tend to appear further away from the noun than more concrete ones (red, round, French).

Prosody, the rhythm and sound of speech, is likely to play a role, too – as there is a tendency for speakers to place longer adjectives after shorter ones. But probably the most compelling theory links adjective position with semantic closeness to the noun being described; adjectives that are closely related to the noun in meaning, and are therefore likely to appear frequently in combination with it, are placed closest, while those that are less closely related appear further away.

In Forsyth’s example, it is the knife’s whittling capabilities that are most significant – distinguishing it from a carving, fruit or butter knife – while its loveliness is hardest to define (what are the standards for judging the loveliness of a whittling knife?) and thus most subjective. Whether any slight reorganisation of the other adjectives would really prompt your friends to view you as a knife-wielding maniac is harder to determine – but then, at least it’s just a whittling knife.


Simon Horobin, Professor of English Language and Literature, University of Oxford

This article was originally published on The Conversation. Read the original article.

How the Queen’s English has had to defer to Africa’s rich multilingualism


Rajend Mesthrie, University of Cape Town

For the first time in history a truly global language has emerged. English enables international communication par excellence, with a far wider reach than other possible candidates for this position – like Latin in the past, and French, Spanish and Mandarin in the present.

In a memorable phrase, former Tanzanian statesman Julius Nyerere once characterised English as the Kiswahili of the world. In Africa, English is more widely spoken than other important lingua francas like Kiswahili, Arabic, French and Portuguese, with at least 26 countries using English as one of their official languages.

But English in Africa comes in many different shapes and forms. It has taken root in an exceptionally multilingual context, with well over a thousand languages spoken on the continent. The influence of this multilingualism tends to be largely erased at the most formal levels of use – for example, in the national media and in higher educational contexts. But at an everyday level, the Queen’s English has had to defer to the continent’s rich abundance of languages. Pidgin, creole, second-language and first-language English all flourish alongside them.

The birth of new languages

English did not enter Africa as an innocent language. Its history is tied up with trade and exploitation, capitalist expansion, slavery and colonisation.


As the need for communication arose and increased under these circumstances, forms of English, known as pidgins and creoles, developed. This took place within a context of unequal encounters, a lack of sustained contact with speakers of English and an absence of formal education. Under these conditions, English words were learnt and attached to an emerging grammar that owed more to African languages than to English.

A pidgin is defined by linguists as an initially simple form of communication that arises from contact between speakers of disparate languages who have no other means of communication in common. Pidgins, therefore, do not have mother-tongue speakers. The existence of pidgins in the early period of West African-European contact is not well documented, and some linguists like Salikoko Mufwene judge their early significance to be overestimated.

Pidgins can become more complex if they take on new functions. They are relabelled creoles if, over time and under specific circumstances, they become fully developed as the first language of a group of speakers.

Ultimately, pidgins and creoles develop grammatical norms that are far removed from the colonial forms that partially spawned them: to a British English speaker listening to a pidgin or creole, the words may seem familiar in form, but not always in meaning.

Linguists pay particular attention to these languages because they afford them the opportunity to observe creativity at first hand: the birth of new languages.

The creoles of West Africa

West Africa’s creoles are of two types: those that developed outside Africa, and those that developed within the continent itself.

The West African creoles that developed outside Africa emerged out of the multilingual and oppressive slave experience in the New World. They were then brought to West Africa after 1787 by freed slaves repatriated from Britain, North America and the Caribbean. “Krio” was the name given to the English-based creole of slaves freed from Britain who were returned to Sierra Leone, where they were joined by slaves released from Nova Scotia and Jamaica.

Some years after that, in 1821, Liberia was established as an African homeland for freed slaves from the US. These men and women brought with them what some linguists call “Liberian settler English”. This particular creole continues to make Liberia somewhat special on the continent, with American rather than British forms of English dominating there.

These languages from the New World were very influential in their new environments, especially over the developing West African pidgin English.

A more recent, homegrown type of West African creole has emerged in the region. This West African creole is spreading in the context of urban multilingualism and changing youth identities. Over the past 50 years, it has grown spectacularly in Ghana, Cameroon, Equatorial Guinea and Sierra Leone, and it is believed to be the fastest-growing language in Nigeria. In this process pidgin English has been expanded into a creole, used as one of the languages of the home. For such speakers, the designation “pidgin” is now a misnomer, although it remains widely used.

In East Africa, in contrast, the strength and historicity of Kiswahili as a lingua franca prevented the rapid development of pidgins based on colonial languages. There, traders and colonists had to learn Kiswahili for successful everyday communication. This gave locals more time to master English as a fully-fledged second language.

Other varieties of English

Africa, mirroring the trend in the rest of the world, has a large and increasing number of second-language English speakers. Second-language varieties of English are mutually intelligible with first-language versions, while showing varying degrees of difference in accent, grammar and nuance of vocabulary. Formal colonisation and the educational system from the 19th century onwards account for the wide spread of second-language English.

What about first-language varieties of English on the continent? The South African variety looms large in this history, showing similarities with English in Australia and New Zealand, especially in details of accent.

In post-apartheid South Africa many young black people from middle-class backgrounds now speak this variety either as a dominant language or as a “second first-language”. But for most South Africans English is a second language – a very important one for education, business and international communication.

For family and cultural matters, African languages remain of inestimable value throughout the continent.


Rajend Mesthrie, Professor of Linguistics, University of Cape Town

This article was originally published on The Conversation. Read the original article.

Why it’s hard for adults to learn a second language



Brianna Yamasaki, University of Washington

As a young adult in college, I decided to learn Japanese. My father’s family is from Japan, and I wanted to travel there someday.

However, many of my classmates and I found it difficult to learn a language in adulthood. We struggled to connect new sounds and a dramatically different writing system to the familiar objects around us.

It wasn’t so for everyone. There were some students in our class who were able to acquire the new language much more easily than others.

So, what makes some individuals “good language learners”? And do such individuals have a “second language aptitude”?

What we know about second language aptitude

Past research on second language aptitude has focused on how people perceive sounds in a particular language and on more general cognitive processes such as memory and learning abilities. Most of this work has used paper-and-pencil and computerized tests to determine language-learning abilities and predict future learning.

Researchers have also studied brain activity as a way of measuring linguistic and cognitive abilities. However, much less is known about how brain activity predicts second language learning.

Is there a way to predict someone’s aptitude for learning a second language?


In a recently published study, Chantel Prat, associate professor of psychology at the Institute for Learning and Brain Sciences at the University of Washington, and I explored how brain activity recorded at rest – while a person is relaxed with their eyes closed – could predict the rate at which a second language is learned among adults who spoke only one language.

Studying the resting brain

Resting brain activity is thought to reflect the organization of the brain, and it has been linked to intelligence: the general ability to reason and solve problems.

We measured brain activity obtained from a “resting state” to predict individual differences in the ability to learn a second language in adulthood.

To do that, we recorded five minutes of eyes-closed resting-state electroencephalography, a method that detects electrical activity in the brain, in young adults. We also collected two hours of paper-and-pencil and computerized tasks.

We then had 19 participants complete eight weeks of French language training using a computer program. This software was developed by the U.S. armed forces with the goal of getting military personnel functionally proficient in a language as quickly as possible.

The software combined reading, listening and speaking practice with game-like virtual reality scenarios. Participants moved through the content in levels organized around different goals, such as being able to communicate with a virtual cab driver by finding out if the driver was available, telling the driver where their bags were and thanking the driver.


Nineteen adult participants (18-31 years of age) completed two 30-minute training sessions per week for a total of 16 sessions. After each training session, we recorded the level that each participant had reached. At the end of the experiment, we used that level information to calculate each individual’s learning rate across the eight-week training.

As expected, there was large variability in the learning rate, with the best learner moving through the program more than twice as quickly as the slowest learner. Our goal was to figure out which (if any) of the measures recorded initially predicted those differences.
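As an illustration of that calculation, here is a minimal sketch in Python. The session-by-session level records are invented, and the study may have computed the rate differently; the point is simply that a learning rate can be read off as the slope of level reached against session number.

```python
import numpy as np

# Hypothetical records: the curriculum level a participant had reached
# after each of the 16 training sessions (invented numbers).
sessions = np.arange(1, 17)
fast_learner = np.array([1, 2, 3, 5, 6, 8, 9, 11, 12, 14, 15, 17, 18, 20, 21, 23])
slow_learner = np.array([1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6, 7, 7, 8, 8])

def learning_rate(levels: np.ndarray) -> float:
    """Learning rate as the slope of a least-squares line fitted to
    level reached versus session number."""
    slope, _intercept = np.polyfit(sessions, levels, 1)
    return slope

print(f"fast learner: {learning_rate(fast_learner):.2f} levels per session")
print(f"slow learner: {learning_rate(slow_learner):.2f} levels per session")
# With these invented numbers the fast learner's slope is roughly three
# times the slow learner's.
```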

A new brain measure for language aptitude

When we correlated our measures with learning rate, we found that patterns of brain activity that have been linked to linguistic processes predicted how easily people could learn a second language.

Patterns of activity over the right side of the brain predicted upwards of 60 percent of the differences in second language learning across individuals. This finding is consistent with previous research showing that the right half of the brain is more frequently used with a second language.

Our results suggest that the majority of the language learning differences between participants could be explained by the way their brain was organized before they even started learning.
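To make the statistics explicit: “predicting upwards of 60 percent of the differences” corresponds to a squared correlation (r²) of about 0.6 between a resting-state measure and learning rate. Here is a small sketch with invented data; the variable names and numbers are assumptions, not the study’s actual measures.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Invented data for 19 participants: a single resting-state EEG measure
# (e.g. power in one frequency band over right-hemisphere electrodes)
# and each participant's learning rate.
eeg_measure = rng.normal(size=19)
learning_rates = 0.8 * eeg_measure + rng.normal(scale=0.65, size=19)

# Pearson correlation; its square is the share of variance explained.
r = np.corrcoef(eeg_measure, learning_rates)[0, 1]
print(f"r = {r:.2f}, variance explained = {r * r:.0%}")
# An r-squared of 0.6 corresponds to a correlation of about 0.77.
```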

Implications for learning a new language

Does this mean that if you, like me, don’t have a “quick second language learning” brain you should forget about learning a second language?

Not quite.


First, it is important to remember that 40 percent of the difference in language learning rate still remains unexplained. Some of this is certainly related to factors like attention and motivation, which are known to be reliable predictors of learning in general, and of second language learning in particular.

Second, we know that people can change their resting-state brain activity. So training may help to shape the brain into a state in which it is more ready to learn. This could be an exciting future research direction.

Second language learning in adulthood is difficult, but the benefits are large for those who, like myself, are motivated by the desire to communicate with others who do not speak their native tongue.


Brianna Yamasaki, Ph.D. Student, University of Washington

This article was originally published on The Conversation. Read the original article.

British Council Backs Bilingual Babies



The British Council is to open a bilingual pre-school in Hong Kong in August. The International Pre-School, which will teach English and Cantonese and have specific times set aside for Mandarin, will follow the UK-based International Primary Curriculum.

The British Council already has bilingual pre-schools in Singapore and Madrid. The adoption of a bilingual model of early years learning, rather than a purely English-medium one, is supported by much of the research on this age group. In a randomised controlled trial in the US state of New Jersey, for example, three- and four-year-olds from both Spanish- and English-speaking backgrounds were assigned by lottery to either an all-English or English–Spanish pre-school programme which used an identical curriculum. The study found that children from the bilingual programme emerged with the same level of English as those in the English-medium one, but both the Spanish-speaking and anglophone children had a much higher level of Spanish.

http://www.elgazette.com/item/281-british-council-backs-bilingual-babies.html

How the British military became a champion for language learning



Wendy Ayres-Bennett, University of Cambridge

When an army deploys in a foreign country, there are clear advantages if the soldiers are able to speak the local language or dialect. But what if your recruits are no good at other languages? In the UK, where language learning in schools and universities is facing a real crisis, the British army began to see this as a serious problem.

In a new report on the value of languages, my colleagues and I showcased how a new language policy instituted last year within the British Army was triggered by a growing appreciation of the risks of language shortages for national security.

Following the conflicts in Iraq and Afghanistan, the military sought to implement language skills training as a core competence. Speakers of other languages are encouraged to take examinations to register their language skills, whether they are language learners or speakers of heritage or community languages.

The UK Ministry of Defence’s Defence Centre for Language and Culture also offers training to NATO standards across the four language skills – listening, speaking, reading and writing. Core languages taught are Arabic, Dari, Farsi, French, Russian, Spanish and English as a foreign language. Cultural training that provides regional knowledge and cross-cultural skills is still embryonic, but developing fast.

Cash incentives

There are two reasons why this is working. The change was directed by the vice chief of the defence staff, and therefore had a high-level champion. There are also financial incentives for army personnel to have their linguistic skills recorded, ranging from £360 for a lower-level western European language to £11,700 for a high-level, operationally vital linguist. Currently, any army officer must have a basic language skill to be able to command a sub-unit.


We should not, of course, overstate the progress made. The numbers of Ministry of Defence linguists for certain languages, including Arabic, are still precariously low and, according to recent statistics, there are no speakers of Ukrainian or Estonian classed at level three or above in the armed forces. But, crucially, the organisational culture has changed and languages are now viewed as an asset.

Too fragmented

The British military’s new approach is a good example of how an institution can change the culture of the way it thinks about languages. It’s also clear that language policy can no longer simply be a matter for the Department for Education: champions for language both within and outside government are vital for issues such as national security.

This is particularly important because of the fragmentation of language learning policy within the UK government, despite an informal cross-Whitehall language focus group.

Experience on the ground illustrates the value of cooperation when it comes to security. For example, in January, the West Midlands Counter Terrorism Unit urgently needed a speaker of a particular language dialect to assist with translating communications in an ongoing investigation. The MOD was approached and was able to source a speaker within another department.

There is a growing body of research demonstrating the cost to business of the UK’s lack of language skills. Much less is known about their value to national security, defence and diplomacy, conflict resolution and social cohesion. Yet language skills have to be seen as an asset, and appreciation is needed across government for their wider value to society and security.


Wendy Ayres-Bennett, Professor of French Philology and Linguistics, University of Cambridge

This article was originally published on The Conversation. Read the original article.

Britain may be leaving the EU, but English is going nowhere



Andrew Linn, University of Westminster

After Brexit, there are various things that some in the EU hope to see and hear less of in the future. One is Nigel Farage. Another is the English language.

In the early hours of June 24, as the referendum outcome was becoming clear, Jean-Luc Mélenchon, left-wing MEP and French presidential candidate, tweeted that “English cannot be the third working language of the European parliament”.

This is not the first time that French and German opinion has weighed in against alleged disproportionate use of English in EU business. In 2012, for example, a similar point was made about key eurozone recommendations from the European Commission being published initially “in a language which [as far as the Euro goes] is only spoken by less than 5m Irish”. With the number of native speakers of English in the EU set to drop from 14% to around 1% of the bloc’s total with the departure of the UK, this point just got a bit sharper.

Translation overload

Official EU language policy is multilingualism with equal rights for all languages used in member states. It recommends that “every European citizen should master two other languages in addition to their mother tongue” – Britain’s abject failure to achieve this should make it skulk away in shame.

The EU recognises 24 “official and working” languages, a number that has mushroomed from the original four (Dutch, French, German and Italian) as more countries have joined. All EU citizens have a right to access EU documents in any of those languages. This calls for a translation team numbering around 2,500, not to mention a further 600 full-time interpreters. In practice most day-to-day business is transacted in either English, French or German and then translated, but it is true that English dominates to a considerable extent.


The preponderance of English has nothing to do with the influence of Britain or even Britain’s membership of the EU. Historically, the expansion of the British empire, the impact of the industrial revolution and the emergence of the US as a world power have embedded English in the language repertoire of speakers across the globe.

Unlike Latin, which outlived the Roman empire as the lingua franca of medieval and renaissance Europe, English of course has native speakers (who may be unfairly advantaged), but it is those who have learned English as a foreign language – “Euro-English” or “English as a lingua franca” – who now constitute the majority of users.

According to the 2012 Special Eurobarometer on Europeans and their Languages, English is the most widely spoken foreign language in 19 of the member states where it is not an official language. Across Europe, 38% of people speak English well enough as a foreign language to have a conversation, compared with 12% for French and 11% for German.

The report also found that 67% of Europeans consider English the most useful foreign language, and that the numbers favouring German (17%) or French (16%) have declined. As a result, 79% of Europeans want their children to learn English, compared to 20% for French and German.

Too much invested in English

Huge sums have been invested in English teaching by both national governments and private enterprise. As the demand for learning English has increased, so has the supply. English language learning worldwide was estimated to be worth US$63.3 billion (£47.5 billion) in 2012, and it is expected that this market will rise to US$193.2 billion (£145.6 billion) by 2017. The value of English for speakers of other languages is not going to diminish any time soon. There is simply too much invested in it.

Speakers of English as a second language outnumber first-language English speakers by 2:1 both in Europe and globally. For many Europeans, and especially those employed in the EU, English is a useful piece in a toolbox of languages to be pressed into service when needed – a point which was evident in a recent project on whether the use of English in Europe was an opportunity or a threat. So in the majority of cases using English has precisely nothing to do with the UK or Britishness. The EU needs practical solutions and English provides one.

English is unchallenged as the lingua franca of Europe. It has even been suggested that in some countries of northern Europe it has become a second rather than a foreign language. Jan Paternotte, D66 party leader in Amsterdam, has proposed that English should be decreed the official second language of that city.

English has not always held its current privileged status. French and German have both functioned as common languages for high-profile fields such as philosophy, science and technology, politics and diplomacy, not to mention Church Slavonic, Russian, Portuguese and other languages in different times and places.

We can assume that English will not maintain its privileged position forever. Those who benefit now, however, are not the predominantly monolingual British, but European anglocrats whose multilingualism provides them with a key to international education and employment.

Much about the EU may be about to change, but right now an anti-English language policy so dramatically out of step with practice would simply make the post-Brexit hangover more painful.


Andrew Linn, Pro-Vice-Chancellor and Dean of Social Sciences and Humanities, University of Westminster

This article was originally published on The Conversation. Read the original article.