British literature is richly tangled with other histories and cultures – so why is it sold as largely white and English?


Brick Lane: popularised in a novel by the British writer Monica Ali.
Shutterstock

Elleke Boehmer, University of Oxford and Erica Lombard, University of Oxford

Recent global developments have sharply polarised communities in many countries around the world. A new politics of exclusion has drawn urgent attention to the ways in which structural inequality has marginalised and silenced certain sectors of society. And yet, as a recent report shows, diversity and inclusion in fact “benefit the common good”. A more diverse group is a stronger, more creative and productive group.

In the world of literary writing, we find similar gaps and exclusions. But these are counterbalanced in some respects by new positive initiatives.

In 2015, a study revealed that literature by writers of colour had been consistently under-represented by the predominantly white British book industry. Statistics in The Bookseller show that out of thousands of books published in 2016 in the UK, fewer than 100 were by British authors of a non-white background. And out of 400 authors identified by the British public in a 2017 Royal Society of Literature survey, only 7% were black, Asian or of mixed race (compared to 13% of the population).

Colourful misrepresentation

A similar marginalisation takes place in the curricula in schools and universities, mirroring exclusions in wider society. In most English literature courses of whatever period, the writers taught are white, largely English and largely male.

A fundamental inequality arises in which, though British culture at large is diverse, syllabuses are not. Indeed, many British readers and students find little to recognise or to identify with when they read and study mainstream British literature.

But it’s not just a case of under-representation. It’s also a case of misrepresentation.

Black and Asian writers who have been published within the mainstream British system describe the pressure they have felt to conform to cultural stereotypes in their work. Their books are often packaged and presented in ways that focus on their ethnicity, regularly using cliches. At the same time, more universal aspects of their writing are overlooked. For example, the covers of novels by Asian British writers usually stick to a limited colour palette of yellows, reds, and purples, accented by “exotic” images.


These writers bristle at the sense that they are read not as crafters of words and worlds, but as spokespeople for their communities or cultures. At its worst, this process turns these writers and their books into objects of anthropological curiosity rather than works inviting serious literary study or simply pleasurable reading. The message is that black and Asian literature is other than or outside mainstream British writing.

Against these exclusions, leading British authors such as Bernardine Evaristo have called for a broader, more inclusive approach. They recognise that what and how we read shapes our sense of ourselves, our communities and the world.

Reframing the narrative

The Postcolonial Writers Make Worlds research project, based in the Oxford English Faculty and The Oxford Research Centre in the Humanities, set out to ask what it means to read contemporary fiction as British readers. Working with reading groups and in discussion with writers, we found that readers of all ages entered the relatively unfamiliar worlds created by BAME authors with interest.

For many, finding points of familiarity along gender, age, geographical or other lines was important for their ability to enjoy stories from communities different from their own. Identifying in this way gave some readers new perspectives on their own contexts. At the same time, unfamiliarity was not a barrier to identification. In some cases, universal human stories, like falling in love, acted as a bridge. This suggests that how literature is presented to readers, whether it is framed as other or not, can be as significant as what is represented.

Contemporary black and Asian writing from the UK is British writing. This means that the work of writers such as Evaristo, Nadifa Mohamed and Daljit Nagra should be placed on the same library shelf, reading list and section of the bookshop as work by Ian McEwan, Julian Barnes and Ali Smith – not exclusively in “world interest” or “global literature” sections.


Equally, much can be gained by thinking of white British writers like Alan Hollinghurst or Hilary Mantel as having as much of a cross-cultural or even postcolonial outlook as Aminatta Forna and Kamila Shamsie.

There are positive signs. A new Edexcel/Pearson A-level teaching resource on Contemporary Black British Literature has been developed. The Why is My Curriculum White? campaign continues to make inroads into university syllabuses. And the Jhalak Prize is raising the profile of BAME writing in Britain. Against this background, the Postcolonial Writers Make Worlds website offers a multimedia hub of resources on black and Asian British writing, providing points of departure for more inclusive, wide-ranging courses. Yet there is still much to be done.

All literature written in English in the British Isles is densely entangled with other histories, cultures and pathways of experience, both within the country and far beyond. Our syllabuses, publishing practices and conversations about books must reflect this.

Elleke Boehmer, Professor of World Literature in English, University of Oxford and Erica Lombard, Postdoctoral Research Fellow, University of Oxford

This article was originally published on The Conversation. Read the original article.


Crimes of grammar and other writing misdemeanours



Roslyn Petelin, The University of Queensland

Writing an article like this is just asking for trouble. Already, I can hear one reader asking “Why do you need just?” Another suggesting that like should be replaced by such as. And yet another saying “fancy using a cliché like asking for trouble!”

Another will mutter: “Where’s your evidence?”

My evidence lies in the vehement protestations that I face when going through solutions to an editing test or grammar quiz with on-campus students in my writing courses at The University of Queensland. And no, that’s not deferential capitalisation: the official name takes a capital ‘T’.

Confirming evidence lies in the querulous discussion-board posts from dozens of students when they see the answers to quizzes on the English Grammar and Style massive open online course that I designed.


Further evidence lies in the fervour with which people comment on articles such as the one that you are currently reading. For instance, a 2013 article, “10 grammar rules you can forget: how to stop worrying and write proper”, by The Guardian’s style editor David Marsh prompted 956 comments. Marsh loves breaking “real” rules. The title of his recent book is For Who the Bell Tolls. I’d prefer properly to proper and whom to who, but not everybody else would.

Marsh’s 10 forgettable rules are ones that my favourite grammarian, Professor Geoffrey Pullum, co-author of The Cambridge Grammar of the English Language, calls zombie rules: “though dead, they shamble mindlessly on”. A list of zombie rules invariably includes never beginning a sentence with “and”, “but”, or “because”, as well as the strictures that are a hangover from Latin: never split an infinitive and never end a sentence with a preposition. It (should it be they?) couldn’t be done in Latin, but it (they?) can be done in English. Just covering my bases here.

So, what’s my stance on adhering to Standard English? I’m certainly not a grammar Nazi, nor even a grammando, a portmanteau term that first appeared in The New York Times in 2012 and is hardly any softer. Am I a vigilante, a pedant, a per(s)nickety person? Am I a snoot? Snoot is the acronym that the late David Foster Wallace and his mother — both English teachers — coined from Sprachgefühl Necessitates Our Ongoing Tendance or, for those with neither German nor a cache of obsolete words in their vocabulary, Syntax Nudniks of Our Time.

David Foster Wallace.
yoosi barzilai/Flickr

Foster Wallace reserves snoot for a “really extreme usage fanatic”, the sort of person whose idea of Sunday fun would have been to find mistakes in the late William Safire’s On Language column in The New York Times Magazine. Safire was a style maven who wrote articles with intriguing opening lines such as this: “A sinister force for solecism exists on Madison Avenue. It is the work of the copywrongers”.

Growing up with a mother who would stage a “pretend” coughing fit when her children made a grammar error clearly contributed to Foster Wallace’s SNOOTitude. His 50-page essay “Authority and American Usage”, published in 2005, constitutes a brilliant, if somewhat eccentric, survey of English grammar.

I need to be a bit of a snoot because part of my brief as a writing educator is to prepare graduates for their utilitarian need to function as writing workers in a writing-reliant workplace where professional standards are crucial and errors erode credibility. (I see the other part of my brief as fostering a love of language that will provide them with lifelong recreational pleasure.)

How do I teach students to avoid grammar errors, ambiguous syntax, and infelicities and gaucheries in style? In the closing chapter of my new book on effective writing, I list around 80 potential problems in grammar, punctuation, style, and syntax.

My hateful eight

My brief for this article is to highlight eight of these problems. Should I identify ones that peeve me the most or ones that cause most dissonance for readers? What’s the peevishness threshold of readers of The Conversation? Let’s go with mine, for now; they may also be yours. They are in no particular order and they depend on the writing context in which they are set: academic, corporate, creative, or journalistic.

Archaic language: amongst, whilst. Replace them with among and while.

Resistance to the singular “they”. Here’s an unbearably tedious example from a book published in 2016 in London: “The four victims each found a small book like this in his or her home, or among his or her possessions, several weeks before the murder occurred in each case”. Replace his or her with their.

In January this year, The American Dialect Society announced the singular “they” as their Word of the Year for 2015, decades after Australia welcomed and widely adopted it.

Placement of modifiers. Modifiers need to have a clear, direct relationship with the word/s that they modify. The title of Rob Lowe’s autobiography should be Stories I Tell Only My Friends, not Stories I Only Tell My Friends. However, I’ll leave Brian Wilson alone with “God only knows what I’d be without you”, though I know that he meant “Only God knows what I’d be without you”.

And how amusing is this commentary, which appeared in The Times on 18 April 2015? “A longboat full of Vikings, promoting the new British Museum exhibition, was seen sailing past the Palace of Westminster yesterday. Famously uncivilised, destructive and rapacious, with an almost insatiable appetite for rough sex and heavy drinking, the MPs nevertheless looked up for a bit to admire the vessel”.

Incorrect pronouns. The irritating genteelism of “They asked Agatha and myself to dinner” and the grammatically incorrect “They asked Agatha and I to dinner”, when in both instances it should be “me”.

Ambiguity/obfuscation. “Few Bordeaux give as much pleasure at this price”. How ethical is that on a bottle of red wine of unidentified origin?

The wrong preposition. The rich are very different to you and me. (Change “to” to “from” to make sense.) Not to be mistaken with. (Change “with” to “for”.) No qualms with. (Change “with” to “about”.)


The wrong word. There are dozens of “confusable” words that a spell checker won’t necessarily help with: “Yes, it is likely that working off campus may effect what you are trying to do”. Ironically, this could be correct, but I know that that wasn’t the writer’s intended message. And how about practice/practise, principal/principle, lead/led, and many more.

Worryingly equivocal language. After the Easter strike some time ago, the CEO of Qantas, Alan Joyce, sent out an apologetic letter that included the sentence: “Despite some sensational coverage recently, safety was never an issue … We always respond conservatively to any mechanical or performance issue”. I hoped at the time that that wasn’t what he meant, because I felt far from reassured by the message.

Alert readers will have noticed that I haven’t railed against poorly punctuated sentences. I’ll do that next time. A poorly punctuated sentence cannot be grammatically correct.


Roslyn Petelin, Associate Professor in Writing, The University of Queensland

This article was originally published on The Conversation. Read the original article.

Beware the bad big wolf: why you need to put your adjectives in the right order



Simon Horobin, University of Oxford

Unlikely as it sounds, the topic of adjective use has gone “viral”. The furore centres on the claim, taken from Mark Forsyth’s book The Elements of Eloquence, that adjectives appearing before a noun must appear in the following strict sequence: opinion, size, age, shape, colour, origin, material, purpose, Noun. Even the slightest attempt to disrupt this sequence, according to Forsyth, will result in the speaker sounding like a maniac. To illustrate this point, Forsyth offers the following example: “a lovely little old rectangular green French silver whittling knife”.
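
For the computationally minded, Forsyth’s claim amounts to a simple sort: assign each adjective a category, then rank the categories in his fixed sequence. Here is a minimal Python sketch of that idea; the small lexicon of category labels is an invented assumption for illustration, not data from Forsyth’s book.

# Toy illustration of Forsyth's claimed adjective sequence.
# The lexicon below is an invented example, not data from the book.
FORSYTH_ORDER = ["opinion", "size", "age", "shape", "colour",
                 "origin", "material", "purpose"]

LEXICON = {  # hypothetical category assignments
    "lovely": "opinion", "little": "size", "old": "age",
    "rectangular": "shape", "green": "colour", "French": "origin",
    "silver": "material", "whittling": "purpose",
}

def forsyth_sort(adjectives):
    """Order adjectives by their position in Forsyth's sequence."""
    return sorted(adjectives, key=lambda a: FORSYTH_ORDER.index(LEXICON[a]))

print(" ".join(forsyth_sort(
    ["green", "French", "lovely", "old", "whittling",
     "silver", "little", "rectangular"])) + " knife")
# -> lovely little old rectangular green French silver whittling knife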


But is the “rule” worthy of an internet storm – or is it more of a ripple in a teacup? Well, certainly the example is a rather unlikely sentence, and not simply because whittling knives are not in much demand these days – ignoring the question of whether they can be both green and silver. This is because it is unusual to have a string of attributive adjectives (ones that appear before the noun they describe) like this.

More usually, speakers of English break up the sequence by placing some of the adjectives in predicative position – after the noun. Not all adjectives, however, can be placed in either position. I can refer to “that man who is asleep” but it would sound odd to refer to him as “that asleep man”; we can talk about the “Eastern counties” but not the “counties that are Eastern”. Indeed, our distribution of adjectives both before and after the noun reveals another constraint on adjective use in English – a preference for no more than three before a noun. An “old brown dog” sounds fine, a “little old brown dog” sounds acceptable, but a “mischievous little old brown dog” sounds plain wrong.

Rules, rules, rules

Nevertheless, however many adjectives we choose to employ, they do indeed tend to follow a predictable pattern. While native speakers intuitively follow this rule, most are unaware that they are doing so; we agree that the “red big dog” sounds wrong, but don’t know why. To test this intuition, linguists have analysed large corpora of electronic data to see how frequently pairs of adjectives like “big red” are preferred to “red big”. The results confirm our native intuition, although the preference is not as overwhelming as we might expect – the rule accounts for 78% of the data.

We know how to use them … without even being aware of it.
Shutterstock
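
The corpus test just described is easy to sketch in code: for a given pair of adjectives, count how often each order occurs. The miniature “corpus” below is a made-up stand-in, purely to show the counting; real studies use tagged corpora of many millions of words.

# Minimal sketch of the pair-order test: for a pair such as
# ("big", "red"), count which order occurs more often in a corpus.
# The corpus here is a tiny invented stand-in.
from collections import Counter

corpus = ("the big red dog saw a red big ball and a big red bus "
          "near the small old house").split()

def pair_counts(tokens, a, b):
    """Count adjacent occurrences of 'a b' and 'b a' in tokens."""
    counts = Counter()
    for first, second in zip(tokens, tokens[1:]):
        if (first, second) in ((a, b), (b, a)):
            counts[(first, second)] += 1
    return counts

counts = pair_counts(corpus, "big", "red")
total = sum(counts.values())
for order, n in counts.items():
    print(" ".join(order), f"{n}/{total} = {n/total:.0%}")
# In real corpus data, a preference of around 78% for the
# predicted order is the figure reported for the rule.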

But while linguists have been able to confirm that there are strong preferences in the ordering of pairs of adjectives, no such statistics have been produced for longer strings. Consequently, while Forsyth’s rule appears to make sense, it remains an untested, hypothetical, large, sweeping (sorry) claim.

In fact, even if we stick to just two adjectives it is possible to find examples that appear to break the rule. The “big bad wolf” of fairy tale, for instance, shows the size adjective preceding the opinion one; similarly, “big stupid” is more common than “stupid big”. Examples like these are instead witness to the “Pollyanna Principle”, by which speakers prefer to present positive, or indifferent, values before negative ones.

A further problem with Forsyth’s proposed sequence is that it makes no reference to other constraints that influence adjective order, such as those that apply when we use two adjectives from the same category. Little Richard’s song “Long Tall Sally” would have sounded strange as “Tall Long Sally”, but these are both adjectives of size.

Definitely not Tall Long Sally.

Similarly, we might describe a meal as “nice and spicy” but never “spicy and nice” – reflecting a preference for the placement of general opinions before more specific ones. We also need to bear in mind the tendency for noun phrases to become lexicalised – forming words in their own right. Just as a blackbird is not any kind of bird that is black, a little black dress does not refer to any small black dress but one that is suitable for particular kinds of social engagement.

Since speakers view a “little black dress” as a single entity, its order is fixed; as a result, modifying adjectives must precede little – a “polyester little black dress”. This means that an adjective specifying its material appears before those referring to size and colour, once again contravening Forsyth’s rule.

Making sense of language

Of course, the rule is a fair reflection of much general usage – although the reasons behind this complex set of constraints in adjective order remain disputed. Some linguists have suggested that it reflects the “nouniness” of an adjective; since colour adjectives are commonly used as nouns – “red is my favourite colour” – they appear close to that slot.

Another conditioning factor may be the degree to which an adjective reflects a subjective opinion rather than an objective description – therefore, subjective adjectives that are harder to quantify (boring, massive, middle-aged) tend to appear further away from the noun than more concrete ones (red, round, French).

Prosody, the rhythm and sound of poetry, is likely to play a role, too – as there is a tendency for speakers to place longer adjectives after shorter ones. But probably the most compelling theory links adjective position with semantic closeness to the noun being described; adjectives that are closely related to the noun in meaning, and are therefore likely to appear frequently in combination with it, are placed closest, while those that are less closely related appear further away.

In Forsyth’s example, it is the knife’s whittling capabilities that are most significant – distinguishing it from a carving, fruit or butter knife – while its loveliness is hardest to define (what are the standards for judging the loveliness of a whittling knife?) and thus most subjective. Whether any slight reorganisation of the other adjectives would really prompt your friends to view you as a knife-wielding maniac is harder to determine – but then, at least it’s just a whittling knife.


Simon Horobin, Professor of English Language and Literature, University of Oxford

This article was originally published on The Conversation. Read the original article.

How the Queen’s English has had to defer to Africa’s rich multilingualism


Rajend Mesthrie, University of Cape Town

For the first time in history a truly global language has emerged. English enables international communication par excellence, with a far wider reach than other possible candidates for this position – like Latin in the past, and French, Spanish and Mandarin in the present.

In a memorable phrase, former Tanzanian statesman Julius Nyerere once characterised English as the Kiswahili of the world. In Africa, English is more widely spoken than other important lingua francas like Kiswahili, Arabic, French and Portuguese, with at least 26 countries using English as one of their official languages.

But English in Africa comes in many different shapes and forms. It has taken root in an exceptionally multilingual context, with well over a thousand languages spoken on the continent. The influence of this multilingualism tends to be largely erased at the most formal levels of use – for example, in the national media and in higher educational contexts. But at an everyday level, the Queen’s English has had to defer to the continent’s rich abundance of languages. Pidgin, creole, second-language and first-language English all flourish alongside them.

The birth of new languages

English did not enter Africa as an innocent language. Its history is tied up with trade and exploitation, capitalist expansion, slavery and colonisation.

The history of English is tied up with trade, capitalist expansion, slavery and colonialism.
Shutterstock

As the need for communication arose and increased under these circumstances, forms of English, known as pidgins and creoles, developed. This took place within a context of unequal encounters, a lack of sustained contact with speakers of English and an absence of formal education. Under these conditions, English words were learnt and attached to an emerging grammar that owed more to African languages than to English.

A pidgin is defined by linguists as an initially simple form of communication that arises from contact between speakers of disparate languages who have no other means of communication in common. Pidgins, therefore, do not have mother-tongue speakers. The existence of pidgins in the early period of West African-European contact is not well documented, and some linguists, like Salikoko Mufwene, judge their early significance to be overestimated.

Pidgins can become more complex if they take on new functions. They are relabelled creoles if, over time and under specific circumstances, they become fully developed as the first language of a group of speakers.

Ultimately, pidgins and creoles develop grammatical norms that are far removed from the colonial forms that partially spawned them: to a British English speaker listening to a pidgin or creole, the words may seem familiar in form, but not always in meaning.

Linguists pay particular attention to these languages because they afford them the opportunity to observe creativity at first hand: the birth of new languages.

The creoles of West Africa

West Africa’s creoles are of two types: those that developed outside Africa, and those that developed within the continent.

The West African creoles that developed outside Africa emerged out of the multilingual and oppressive slave experience in the New World. They were then brought to West Africa after 1787 by freed slaves repatriated from Britain, North America and the Caribbean. “Krio” was the name given to the English-based creole of slaves freed from Britain who were returned to Sierra Leone, where they were joined by slaves released from Nova Scotia and Jamaica.

Some years after that, in 1821, Liberia was established as an African homeland for freed slaves from the US. These men and women brought with them what some linguists call “Liberian settler English”. This particular creole continues to make Liberia somewhat special on the continent, with American rather than British forms of English dominating there.

These languages from the New World were very influential in their new environments, especially over the developing West African pidgin English.

A more recent, homegrown type of West African creole has emerged in the region, spreading in the context of urban multilingualism and changing youth identities. Over the past 50 years it has grown spectacularly in Ghana, Cameroon, Equatorial Guinea and Sierra Leone, and it is believed to be the fastest-growing language in Nigeria. In this process pidgin English has been expanded into a creole, used as one of the languages of the home. For such speakers, the designation “pidgin” is now a misnomer, although it remains widely used.

In East Africa, in contrast, the strength and historicity of Kiswahili as a lingua franca prevented the rapid development of pidgins based on colonial languages. There, traders and colonists had to learn Kiswahili for successful everyday communication. This gave locals more time to master English as a fully-fledged second language.

Other varieties of English

Africa, mirroring the trend in the rest of the world, has a large and increasing number of second-language English speakers. Second-language varieties of English are mutually intelligible with first-language versions, while showing varying degrees of difference in accent, grammar and nuance of vocabulary. Formal colonisation and the educational system from the 19th century onwards account for the wide spread of second-language English.

What about first-language varieties of English on the continent? The South African variety looms large in this history, showing similarities with English in Australia and New Zealand, especially in details of accent.

In post-apartheid South Africa many young black people from middle-class backgrounds now speak this variety either as a dominant language or as a “second first-language”. But for most South Africans English is a second language – a very important one for education, business and international communication.

For family and cultural matters, African languages remain of inestimable value throughout the continent.


Rajend Mesthrie, Professor of Linguistics, University of Cape Town

This article was originally published on The Conversation. Read the original article.

Why it’s hard for adults to learn a second language



Brianna Yamasaki, University of Washington

As a young adult in college, I decided to learn Japanese. My father’s family is from Japan, and I wanted to travel there someday.

However, many of my classmates and I found it difficult to learn a language in adulthood. We struggled to connect new sounds and a dramatically different writing system to the familiar objects around us.

It wasn’t so for everyone. There were some students in our class who were able to acquire the new language much more easily than others.

So, what makes some individuals “good language learners”? And do such individuals have a “second language aptitude”?

What we know about second language aptitude

Past research on second language aptitude has focused on how people perceive sounds in a particular language and on more general cognitive processes such as memory and learning abilities. Most of this work has used paper-and-pencil and computerized tests to determine language-learning abilities and predict future learning.

Researchers have also studied brain activity as a way of measuring linguistic and cognitive abilities. However, much less is known about how brain activity predicts second language learning.

Is there a way to predict someone’s aptitude for learning a second language?

How does brain activity change while learning languages?
Brain image via www.shutterstock.com

In a recently published study, Chantel Prat, associate professor of psychology at the Institute for Learning and Brain Sciences at the University of Washington, and I explored how brain activity recorded at rest – while a person is relaxed with their eyes closed – could predict the rate at which a second language is learned among adults who spoke only one language.

Studying the resting brain

Resting brain activity is thought to reflect the organization of the brain, and it has been linked to intelligence, or the general ability used to reason and problem-solve.

We measured brain activity obtained from a “resting state” to predict individual differences in the ability to learn a second language in adulthood.

To do that, we recorded five minutes of eyes-closed resting-state electroencephalography, a method that detects electrical activity in the brain, in young adults. We also administered two hours of paper-and-pencil and computerized tasks.

We then had 19 participants complete eight weeks of French language training using a computer program. This software was developed by the U.S. armed forces with the goal of getting military personnel functionally proficient in a language as quickly as possible.

The software combined reading, listening and speaking practice with game-like virtual reality scenarios. Participants moved through the content in levels organized around different goals, such as being able to communicate with a virtual cab driver by finding out if the driver was available, telling the driver where their bags were and thanking the driver.


Nineteen adult participants (18-31 years of age) completed two 30-minute training sessions per week for a total of 16 sessions. After each training session, we recorded the level that each participant had reached. At the end of the experiment, we used that level information to calculate each individual’s learning rate across the eight-week training.

As expected, there was large variability in the learning rate, with the best learner moving through the program more than twice as quickly as the slowest learner. Our goal was to figure out which (if any) of the measures recorded initially predicted those differences.

A new brain measure for language aptitude

When we correlated our measures with learning rate, we found that patterns of brain activity that have been linked to linguistic processes predicted how easily people could learn a second language.

Patterns of activity over the right side of the brain predicted upwards of 60 percent of the differences in second language learning across individuals. This finding is consistent with previous research showing that the right half of the brain is more frequently used with a second language.
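
In outline, that analysis can be reconstructed in two steps: estimate each learner’s rate as the slope of level reached against session number, then correlate those rates with a resting-state measure, the squared correlation giving the share of differences “explained”. The Python sketch below (Python 3.10+ for statistics.correlation) uses invented numbers purely for illustration; the study’s actual EEG measures and statistics are more involved.

# Illustrative sketch with invented numbers, not the study's data.
import statistics

def slope(xs, ys):
    """Least-squares slope of ys regressed on xs."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

sessions = list(range(1, 17))  # 16 training sessions
# Hypothetical level-by-session records for four learners.
levels_by_subject = [[round(rate * s) for s in sessions]
                     for rate in (0.7, 0.5, 0.35, 0.25)]
# Hypothetical resting-state EEG summary value per learner.
eeg_measure = [0.42, 0.31, 0.22, 0.18]

rates = [slope(sessions, levels) for levels in levels_by_subject]
r = statistics.correlation(eeg_measure, rates)  # Pearson's r
print("learning rates:", [round(x, 2) for x in rates])
print(f"r = {r:.2f}, r^2 = {r * r:.2f} of the variance 'explained'")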

Our results suggest that the majority of the language learning differences between participants could be explained by the way their brain was organized before they even started learning.

Implications for learning a new language

Does this mean that if you, like me, don’t have a “quick second language learning” brain you should forget about learning a second language?

Not quite.

Language learning can depend on many factors.
Child image via www.shutterstock.com

First, it is important to remember that 40 percent of the difference in language learning rate still remains unexplained. Some of this is certainly related to factors like attention and motivation, which are known to be reliable predictors of learning in general, and of second language learning in particular.

Second, we know that people can change their resting-state brain activity. So training may help to shape the brain into a state in which it is more ready to learn. This could be an exciting future research direction.

Second language learning in adulthood is difficult, but the benefits are large for those who, like myself, are motivated by the desire to communicate with others who do not speak their native tongue.


Brianna Yamasaki, Ph.D. Student, University of Washington

This article was originally published on The Conversation. Read the original article.

British Council Backs Bilingual Babies



The British Council is to open a bilingual pre-school in Hong Kong in August. The International Pre-School, which will teach English and Cantonese and have specific times set aside for Mandarin, will follow the UK-based International Primary Curriculum.

The British Council already has bilingual pre-schools in Singapore and Madrid. The adoption of a bilingual model of early years learning, rather than a purely English-medium one, is supported by much of the research on this age group. In a randomised controlled trial in the US state of New Jersey, for example, three- and four-year-olds from both Spanish- and English-speaking backgrounds were assigned by lottery to either an all-English or an English–Spanish pre-school programme that used an identical curriculum. The study found that children from the bilingual programme emerged with the same level of English as those in the English-medium one, but both the Spanish-speaking and anglophone children had a much higher level of Spanish.

http://www.elgazette.com/item/281-british-council-backs-bilingual-babies.html

How the British military became a champion for language learning



Wendy Ayres-Bennett, University of Cambridge

When an army deploys in a foreign country, there are clear advantages if the soldiers are able to speak the local language or dialect. But what if your recruits are no good at other languages? In the UK, where language learning in schools and universities is facing a real crisis, the British army began to see this as a serious problem.

In a new report on the value of languages, my colleagues and I showcased how a new language policy, instituted last year within the British Army, was triggered by a growing appreciation of the risks of language shortages for national security.

Following the conflicts in Iraq and Afghanistan, the military sought to implement language skills training as a core competence. Speakers of other languages are encouraged to take examinations to register their language skills, whether they are language learners or speakers of heritage or community languages.

The UK Ministry of Defence’s Defence Centre for Language and Culture also offers training to NATO standards across the four language skills – listening, speaking, reading and writing. Core languages taught are Arabic, Dari, Farsi, French, Russian, Spanish and English as a foreign language. Cultural training that provides regional knowledge and cross-cultural skills is still embryonic, but developing fast.

Cash incentives

There are two reasons why this is working. The change was directed by the vice chief of the defence staff, and therefore had a high-level champion. There are also financial incentives for army personnel to have their linguistic skills recorded, ranging from £360 for a lower-level western European language to £11,700 for a high-level, operationally vital linguist. Currently, any army officer must have a basic language skill to be able to command a sub-unit.

A British army sergeant visits a school in Helmand, Afghanistan.
Defence Images/flickr.com, CC BY-NC

We should not, of course, overstate the progress made. The numbers of Ministry of Defence linguists for certain languages, including Arabic, are still precariously low and, according to recent statistics, there are no speakers of Ukrainian or Estonian classed at level three or above in the armed forces. But, crucially, the organisational culture has changed and languages are now viewed as an asset.

Too fragmented

The British military’s new approach is a good example of how an institution can change the culture of the way it thinks about languages. It’s also clear that language policy can no longer simply be a matter for the Department for Education: champions for language both within and outside government are vital for issues such as national security.

This is particularly important because of the fragmentation of language learning policy within the UK government, despite an informal cross-Whitehall language focus group.

Experience on the ground illustrates the value of cooperation when it comes to security. For example, in January, the West Midlands Counter Terrorism Unit urgently needed a speaker of a particular language dialect to assist with translating communications in an ongoing investigation. The MOD was approached and was able to source a speaker within another department.

There is a growing body of research demonstrating the cost to business of the UK’s lack of language skills. Much less is known about their value to national security, defence and diplomacy, conflict resolution and social cohesion. Yet language skills have to be seen as an asset, and appreciation is needed across government for their wider value to society and security.


Wendy Ayres-Bennett, Professor of French Philology and Linguistics, University of Cambridge

This article was originally published on The Conversation. Read the original article.