Tough immigration laws are hitting Britain’s curry houses hard



Emily Falconer, University of Westminster

The British curry industry is responsible for 100,000 jobs and contributes more than £4 billion to the UK economy. But it’s now feared that up to a third of all Indian restaurants could disappear because of tougher immigration laws.

The current rules require restaurants that want to employ a chef from outside the EU to pay a minimum salary of £35,000 – or £29,750 with accommodation and food – to secure a visa.

These high costs mean that many restaurants cannot hire the skilled chefs they need, creating a shortage of top talent – and the chefs who are available can demand higher wages. This combination of rising costs and scarce chefs means that many curry houses now face closure.

Fusion food

Britain has a long, deep relationship with what is widely known as “Indian” food. But food eaten on the Indian subcontinent is so diverse that it has as many differences as similarities, meaning “Indian” and “curry” are often used as umbrella terms for what is in reality a multifaceted combination of tastes and influences.

It’s been predicted that more than half of all curry houses may shut down within ten years.
Shutterstock

“Indian food” in reality is often derived from particular regions of India, Pakistan, Bangladesh and Sri Lanka as well as across Britain and Europe. And a long and complex history of colonialism and migration has made the “British Curry” a popular national dish.

As the author Panikos Panayi explains, decades of residing in Britain have inevitably changed the tastes and eating practices of many British Asian communities – whose connection with traditional foods has become increasingly tenuous.

In his book Spicing Up Britain: The Multicultural History of British Food, Panayi charts patterns of migration and their influence on food, tastes and consumption habits. He follows British Asians who have grown up with a fusion of tastes and influences all their lives.

These are people whose diets reflect the variants of English food their parents invented to make use of the ingredients readily available to them – as opposed to just tastes from the Indian subcontinent. It meant childhood classics became spicy cheese on toast, baked bean balti with spring onion sabji, and masala burgers.

Merging of tastes

Panayi claims that the taste of South Asian food became as much a part of childhood for white British children living in certain areas of the UK as for their second- and third-generation Asian school friends.

In the London borough of Tower Hamlets, for example – which is home to a large Bangladeshi community – local councillors played a significant role in influencing the content of school dinners. As early as the 1980s these lunches often included Asian dishes, such as chapattis, rice and halal meat, alongside “English” staples of chips, peas and steamed sponge with custard.

Fish and chips and curry sauce – a British speciality.
Flickr/Liz Barker, CC BY-NC

These tastes shaped the palates of many British children, to the point where a combination of “English” food and “curry” became the nostalgic taste of childhood. This was commodified by major brands such as Bisto with their “curry sauce” gravy granules.

These combinations are still a main feature of many “greasy spoon” English cafes and pub menus – which offer British staples such as curry served with a choice of rice or chips, or jacket potatoes with a spicy chicken tikka filling. Then there’s the coronation chicken sandwich – a blend of boiled chicken, curry powder, mayonnaise and sultanas – a nod to the dish created for Queen Elizabeth II’s Coronation lunch in 1953.

More recently, in a time of gastronomic obsession and “foodie” culture, the “hybridisation” of cuisines has shifted from being a matter of necessity – due to availability of ingredients – to an increasingly sophisticated, cosmopolitan and fashionable food trend.

‘One spicy crab coming right up’.
Shutterstock

The influential taste of the British curry can now be identified on modern British fine dining menus, where fillets of Scottish salmon, hand-dived scallops and Cornish crabmeat are infused with cumin, turmeric and fenugreek, while bread and butter pudding is laced with cardamom and saffron.

Multicultural Britain

But in the current political climate of migration restrictions, the free movement of people across borders looks ever more restricted – and with it, our rich heritage as a multicultural country is under threat.

As diverse as the food on our plates.
Shutterstock

This will undoubtedly have a detrimental impact on imported produce and ingredients. It will also affect the diverse communities that have brought with them long histories of knowledge, recipes and cooking practices.

Of course, throughout history there has always been a degree of racism and resistance to “foreign” foods, but for the most part these tastes have been embraced and firmly absorbed into the British diet.

Perhaps then we can take heart during this uncertain time that the merging of cultures is a British tradition set to continue. Because what started as the “taste of the other” is now so deeply ingrained in our food, culture and identity that it is no longer possible to disentangle national, regional or local tastes and determine what belongs where.

Emily Falconer, Lecturer in Sociology, University of Westminster

This article was originally published on The Conversation. Read the original article.

Things you were taught at school that are wrong



Misty Adoniou, University of Canberra

Do you remember being taught you should never start your sentences with “And” or “But”?

What if I told you that your teachers were wrong and there are lots of other so-called grammar rules that we’ve probably been getting wrong in our English classrooms for years?

How did grammar rules come about?

To understand why we’ve been getting it wrong, we need to know a little about the history of grammar teaching.

Grammar is how we organise our sentences in order to communicate meaning to others.

Those who say there is one correct way to organise a sentence are called prescriptivists. Prescriptivist grammarians prescribe how sentences must be structured.

Prescriptivists had their day in the sun in the 18th century. As books became more accessible to the everyday person, prescriptivists wrote the first grammar books to tell everyone how they must write.

These self-appointed guardians of the language just made up grammar rules for English, and put them in books that they sold. It was a way of ensuring that literacy stayed out of reach of the working classes.

They took their newly concocted rules from Latin. This was, presumably, to keep literate English out of reach of anyone who wasn’t rich or posh enough to attend a grammar school, which was a school where you were taught Latin.

And yes, that is the origin of today’s grammar schools.

The other camp of grammarians are the descriptivists. They write grammar guides that describe how English is used by different people, and for different purposes. They recognise that language isn’t static, and it isn’t one-size-fits-all.

1. You can’t start a sentence with a conjunction

Let’s start with the grammatical sin I have already committed in this article. You can’t start a sentence with a conjunction.

Obviously you can, because I did. And I expect I will do it again before the end of this article. There, I knew I would!

Those who say it is always incorrect to start a sentence with a conjunction, like “and” or “but”, sit in the prescriptivist camp.

However, according to the descriptivists, at this point in our linguistic history, it is fine to start a sentence with a conjunction in an op-ed article like this, or in a novel or a poem.

It is less acceptable to start a sentence with a conjunction in an academic journal article, or in an essay for my son’s high school economics teacher, as it turns out. But times are changing.

2. You can’t end a sentence with a preposition

Well, in Latin you can’t. In English you can, and we do all the time.

Admittedly a lot of the younger generation don’t even know what a preposition is, so this rule is already obsolete. But let’s have a look at it anyway, for old times’ sake.

According to this rule, it is wrong to say “Who did you go to the movies with?”

Instead, the prescriptivists would have me say “With whom did you go to the movies?”

I’m saving that structure for when I’m making polite chat with the Queen on my next visit to the palace.

That’s not a sarcastic comment, just a fanciful one. I’m glad I know how to structure my sentences for different audiences. It is a powerful tool. It means I usually feel comfortable in whatever social circumstances I find myself in, and I can change my writing style according to purpose and audience.

That is why we should teach grammar in schools. We need to give our children a full repertoire of language so that they can make grammatical choices that will allow them to speak and write for a wide range of audiences.

3. Put a comma when you need to take a breath

It’s a novel idea, synchronising your writing with your breathing, but the two have nothing to do with one another, and if this is the instruction we give our children, it is little wonder commas are so poorly used.

Punctuation is a minefield and I don’t want to risk blowing up the internet. So here is a basic description of what commas do.

Commas provide demarcation between like grammatical structures. When adjectives, nouns, phrases or clauses are butting up against each other in a sentence, we separate them with a comma. That’s why I put commas between the three nouns and the two clauses in that last sentence.

Commas also provide demarcation for words, phrases or clauses that are embedded in a sentence for effect. The sentence would still be a sentence even if we took those words away. See, for example, the use of commas in this sentence.

4. To make your writing more descriptive, use more adjectives

American writer Mark Twain had it right.

“When you catch an adjective, kill it. No, I don’t mean utterly, but kill most of them – then the rest will be valuable.”

If you want your writing to be more descriptive, play with your sentence structure.

Consider this sentence from Liz Lofthouse’s beautiful children’s book Ziba Came on a Boat. It comes at a key turning point in the book, the story of a refugee’s escape.

“Clutching her mother’s hand, Ziba ran on and on, through the night, far away from the madness until there was only darkness and quiet.”

A beautifully descriptive sentence, and not an adjective in sight.

5. Adverbs are the words that end in ‘ly’

Lots of adverbs end in “ly”, but lots don’t.

Adverbs give more information about verbs. They tell us when, where, how and why the verb happened. So that means words like “tomorrow”, “there” and “deep” can be adverbs.

I say they can be adverbs because, actually, a word is just a word. It becomes an adverb, or a noun, or an adjective, or a verb when it is doing that job in a sentence.

In “deep into the night”, the word “deep” is an adverb. In “down a deep, dark hole”, it is an adjective. And when “I dive into the deep”, it is doing the work of a noun.

Time to take those word lists of adjectives, verbs and nouns off the classroom walls.

Time, also, to ditch those old Englishmen who wrote a grammar for their times, not ours.

If you want to understand what our language can do and how to use it well, read widely, think deeply and listen carefully. And remember, neither time nor language stands still – for any of us.


Misty Adoniou, Associate Professor in Language, Literacy and TESL, University of Canberra

This article was originally published on The Conversation. Read the original article.

Clear skies ahead: how improving the language of aviation could save lives



Dominique Estival, Western Sydney University

The most dangerous part of flying is driving to the airport.

That’s a standard joke among pilots, who know even better than the flying public that aviation is the safest mode of transportation.

But there are still those headlines and TV shows about airline crashes, and those statistics people like to repeat, such as:

Between 1976 and 2000, more than 1,100 passengers and crew lost their lives in accidents in which investigators determined that language had played a contributory role.

True enough, some 80% of all air incidents and accidents occur because of human error. Miscommunication, combined with other human factors such as fatigue, cognitive workload, noise or forgetfulness, has played a role in some of the deadliest accidents.

The best-known, and most widely discussed, is the 1977 collision on the ground of two Boeing 747 aircraft in Tenerife, which resulted in 583 fatalities. The accident was due in part to difficult communications between the pilot, whose native language was Dutch, and the Spanish air traffic controller.

In such a high-stakes environment as commercial aviation, where the lives of hundreds of passengers and innocent people on the ground are involved, communication is critical to safety.

So, it was decided that Aviation English would be the international language of aviation and that all aviation professionals – pilots and air traffic controllers (ATC) – would need to be proficient in it. Highly structured and codified, it is a language designed to minimise ambiguities and misunderstandings.

Pilots and ATC expect to hear certain bits of information in certain ways and in a given order. The “phraseology”, with its particular pronunciation (for example, “fife” and “niner” instead of “five” and “nine”, so they’re not confused with each other), specific words (“Cleared to land”), international alphabet (“Mike Hotel Foxtrot”) and strict conversation rules (you must repeat, or “read back”, an instruction), needs to be learned and practised.
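A rough way to see how codified this phraseology is: the digit and letter conventions described above can be modelled as a simple lookup table. This is only an illustrative sketch – the function name and the three-letter subset are my own, not part of any aviation standard – with the other digit pronunciations following the ICAO radiotelephony conventions.

```python
# Toy transcription of an aircraft callsign into spoken radio phraseology.
# "fife" and "niner" are the pronunciations mentioned above; the remaining
# digits follow ICAO conventions. The letter table is truncated for brevity.
DIGITS = {"0": "zero", "1": "wun", "2": "too", "3": "tree", "4": "fower",
          "5": "fife", "6": "six", "7": "seven", "8": "ait", "9": "niner"}
LETTERS = {"F": "Foxtrot", "H": "Hotel", "M": "Mike"}  # subset only

def spell_callsign(callsign: str) -> str:
    """Convert each character of a callsign to its spoken word."""
    words = []
    for ch in callsign.upper():
        if ch.isdigit():
            words.append(DIGITS[ch])
        elif ch in LETTERS:
            words.append(LETTERS[ch])
    return " ".join(words)

print(spell_callsign("MHF"))  # -> Mike Hotel Foxtrot
```

The point of such a fixed mapping is exactly the one the article makes: every speaker on the frequency produces and expects the same forms, so “five” and “nine” can never be confused.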

In spite of globalisation and the spread of English, most people around the world are not native English speakers, and an increasing number of aviation professionals do not speak English as their first language.

Native speakers have an advantage when they learn Aviation English, since they already speak English at home and in their daily lives. But they encounter many pilots or ATC who learned English as a second or even third language.

Whose responsibility is it to ensure that communication is successful? Can native speakers simply speak the way they do at home and expect to be understood? Or do they also have the responsibility to make themselves understood and to learn how to understand pilots or ATC who are not native English speakers?

As a linguist, I analyse aviation language from a linguistics perspective. I have noted the restricted meaning of the few verbs and adjectives; that the only pronouns are “you” and sometimes “we” (“How do you read?”; “We’re overhead Camden”); how few questions there are, mostly imperatives (“Maintain heading 180”); and that the syntax is so simple (no complement clauses, no relative clauses, no recursion) it might not even count as a human language for Chomsky.

But, as a pilot and a flight instructor, I look at it from the point of view of student pilots learning to use it in the cockpit while also learning to fly the airplane and navigate around the airfield.

How much harder is it to remember what to say when the workload goes up, and how much more difficult to speak over the radio when you know everyone else on the frequency is listening and will notice every little mistake you make?

Imagine, then, how much more difficult this is for pilots with English as a second language.

Camden Airport.
Supplied

Everyone learning another language knows it’s suddenly more challenging to hold a conversation over the phone than face-to-face, even with someone you already know. When it’s over the radio, with someone you don’t know, against the noise of the engine, static noise in the headphones, and while trying to make the plane do what you want it to do, it can be quite daunting.

No wonder student pilots who are not native English speakers sometimes prefer to stay silent, and even some experienced native English speakers will too, when the workload is too great.

This is one of the results of my research conducted in collaboration with UNSW’s Brett Molesworth, combining linguistics and aviation human factors.

Experiments in a flight simulator with pilots of diverse language backgrounds and flying experience explored conditions likely to result in pilots making mistakes or misunderstanding ATC instructions. Not surprisingly, increased workload, too much information and rapid ATC speech caused mistakes.

Also not surprisingly, less experienced pilots, whatever their English proficiency, made more mistakes. But surprisingly, it was the level of training, rather than the number of flying hours or language background, that predicted better communication.

Once we understand the factors contributing to miscommunication in aviation, we can propose solutions to prevent them. For example, technologies such as Automatic Speech Recognition and Natural Language Understanding may help catch errors in pilot readbacks that ATC did not notice and might complement training for pilots and ATC.

It is vital that they understand each other, whatever their native language.


Dominique Estival, Researcher in Linguistics, Western Sydney University

This article was originally published on The Conversation. Read the original article.

Crimes of grammar and other writing misdemeanours



Roslyn Petelin, The University of Queensland

Writing an article like this is just asking for trouble. Already, I can hear one reader asking “Why do you need just?” Another suggesting that like should be replaced by such as. And yet another saying “fancy using a cliché like asking for trouble!”

Another will mutter: “Where’s your evidence?”

My evidence lies in the vehement protestations that I face when going through solutions to an editing test or grammar quiz with on-campus students in my writing courses at The University of Queensland – and no, that’s not deferential capitalisation: the official name takes a capital “T”.

Confirming evidence lies in the querulous discussion-board posts from dozens of students when they see the answers to quizzes on the English Grammar and Style massive open online course that I designed.

Katie Krueger/Flickr

Further evidence lies in the fervour with which people comment on articles such as the one that you are currently reading. For instance, a 2013 article, “10 grammar rules you can forget: how to stop worrying and write proper”, by the style editor of The Guardian, David Marsh, prompted 956 comments. Marsh loves breaking “real” rules. The title of his recent book is For Who the Bell Tolls. I’d prefer “properly” to “proper” and “whom” to “who”, but not everybody else would.

Marsh’s 10 forgettable rules are ones that my favourite grammarian, Professor Geoffrey Pullum, co-author of The Cambridge Grammar of the English Language, calls zombie rules: “though dead, they shamble mindlessly on”. A list of zombie rules invariably includes never beginning a sentence with “and”, “but”, or “because”, as well as the strictures that are a hangover from Latin: never split an infinitive and never end a sentence with a preposition. It (should it be they?) couldn’t be done in Latin, but it (they?) can be done in English. Just covering my bases here.

So, what’s my stance on adhering to Standard English? I’m certainly not a grammar Nazi, nor even a grammando, a portmanteau term that first appeared in The New York Times in 2012 that’s hardly any softer. Am I a vigilante, a pedant, a per(s)nickety person? Am I a snoot? Snoot is the acronym that the late David Foster Wallace and his mother — both English teachers — coined from Sprachgefühl Necessitates Our Ongoing Tendance or, for those with neither German nor a cache of obsolete words in their vocabulary, Syntax Nudniks of Our Time.

David Foster Wallace
Yoosi Barzilai/Flickr

Foster Wallace reserves snoot for a “really extreme usage fanatic”, the sort of person whose idea of Sunday fun would have been to find mistakes in the late William Safire’s On Language column in the New York Times magazine. Safire was a style maven who wrote articles with intriguing opening lines such as this: “A sinister force for solecism exists on Madison Avenue. It is the work of the copywrongers”.

Growing up with a mother who would stage a “pretend” coughing fit when her children made a grammar error clearly contributed to Foster Wallace’s SNOOTitude. His 50-page essay “Authority and American Usage”, published in 2005, constitutes a brilliant, if somewhat eccentric, coverage of English grammar.

I need to be a bit of a snoot because part of my brief as a writing educator is to prepare graduates for their utilitarian need to function as writing workers in a writing-reliant workplace where professional standards are crucial and errors erode credibility. (I see the other part of my brief as fostering a love of language that will provide them with lifelong recreational pleasure.)

How do I teach students to avoid grammar errors, ambiguous syntax, and infelicities and gaucheries in style? In the closing chapter of my new book on effective writing, I list around 80 potential problems in grammar, punctuation, style, and syntax.

My hateful eight

My brief for this article is to highlight eight of these problems. Should I identify ones that peeve me the most or ones that cause most dissonance for readers? What’s the peevishness threshold of readers of The Conversation? Let’s go with mine, for now; they may also be yours. They are in no particular order and they depend on the writing context in which they are set: academic, corporate, creative, or journalistic.

Archaic language: amongst, whilst. Replace them with among and while.

Resistance to the singular “they”. Here’s an unbearably tedious example from a book published in 2016 in London: “The four victims each found a small book like this in his or her home, or among his or her possessions, several weeks before the murder occurred in each case”. Replace his or her with their.

In January this year, The American Dialect Society announced the singular “they” as their Word of the Year for 2015, decades after Australia welcomed and widely adopted it.

Placement of modifiers. Modifiers need to have a clear, direct relationship with the word/s that they modify. The title of Rob Lowe’s autobiography should be Stories I Tell Only My Friends, not Stories I Only Tell My Friends. However, I’ll leave Brian Wilson alone with “God only knows what I’d be without you”, though I know that he meant “Only God knows what I’d be without you”.

And how amusing is this commentary, which appeared in The Times on 18 April 2015? “A longboat full of Vikings, promoting the new British Museum exhibition, was seen sailing past the Palace of Westminster yesterday. Famously uncivilised, destructive and rapacious, with an almost insatiable appetite for rough sex and heavy drinking, the MPs nevertheless looked up for a bit to admire the vessel”.

Incorrect pronouns. The irritating genteelism of “They asked Agatha and myself to dinner” and the grammatically incorrect “They asked Agatha and I to dinner”, when in both instances it should be “me”.

Ambiguity/obfuscation. “Few Bordeaux give as much pleasure at this price”. How ethical is that on a bottle of red wine of unidentified origin?

The wrong preposition. “The rich are very different to you and me” (change “to” to “from” to make sense); “not to be mistaken with” (change “with” to “for”); “no qualms with” (change “with” to “about”).

Alastair Bennett/Flickr

The wrong word. There are dozens of “confusable” words that a spell checker won’t necessarily help with: “Yes, it is likely that working off campus may effect what you are trying to do”. Ironically, this could be correct, but I know that that wasn’t the writer’s intended message. And how about practice/practise, principal/principle, lead/led, and many more.

Worryingly equivocal language. After the Easter strike some time ago, the CEO of Qantas, Alan Joyce, sent out an apologetic letter that included the sentence: “Despite some sensational coverage recently, safety was never an issue … We always respond conservatively to any mechanical or performance issue”. I hoped at the time that that’s not what he meant, because I felt far from reassured by the message.

Alert readers will have noticed that I haven’t railed against poorly punctuated sentences. I’ll do that next time. A poorly punctuated sentence cannot be grammatically correct.


Roslyn Petelin, Associate Professor in Writing, The University of Queensland

This article was originally published on The Conversation. Read the original article.

Why it’s hard for adults to learn a second language



Brianna Yamasaki, University of Washington

As a young adult in college, I decided to learn Japanese. My father’s family is from Japan, and I wanted to travel there someday.

However, many of my classmates and I found it difficult to learn a language in adulthood. We struggled to connect new sounds and a dramatically different writing system to the familiar objects around us.

It wasn’t so for everyone. There were some students in our class who were able to acquire the new language much more easily than others.

So, what makes some individuals “good language learners?” And do such individuals have a “second language aptitude?”

What we know about second language aptitude

Past research on second language aptitude has focused on how people perceive sounds in a particular language and on more general cognitive processes such as memory and learning abilities. Most of this work has used paper-and-pencil and computerized tests to determine language-learning abilities and predict future learning.

Researchers have also studied brain activity as a way of measuring linguistic and cognitive abilities. However, much less is known about how brain activity predicts second language learning.

Is there a way to predict someone’s aptitude for second language learning?

How does brain activity change while learning languages?
Brain image via www.shutterstock.com

In a recently published study, Chantel Prat, associate professor of psychology at the Institute for Learning and Brain Sciences at the University of Washington, and I explored how brain activity recorded at rest – while a person is relaxed with their eyes closed – could predict the rate at which a second language is learned among adults who spoke only one language.

Studying the resting brain

Resting brain activity is thought to reflect the organization of the brain and it has been linked to intelligence, or the general ability used to reason and problem-solve.

We measured brain activity obtained from a “resting state” to predict individual differences in the ability to learn a second language in adulthood.

To do that, we recorded five minutes of eyes-closed resting-state electroencephalography (EEG), a method that detects electrical activity in the brain, in young adults. We also collected two hours of data from paper-and-pencil and computerized tasks.

We then had 19 participants complete eight weeks of French language training using a computer program. This software was developed by the U.S. armed forces with the goal of getting military personnel functionally proficient in a language as quickly as possible.

The software combined reading, listening and speaking practice with game-like virtual reality scenarios. Participants moved through the content in levels organized around different goals, such as being able to communicate with a virtual cab driver by finding out if the driver was available, telling the driver where their bags were and thanking the driver.


Nineteen adult participants (18-31 years of age) completed two 30-minute training sessions per week for a total of 16 sessions. After each training session, we recorded the level that each participant had reached. At the end of the experiment, we used that level information to calculate each individual’s learning rate across the eight-week training.

As expected, there was large variability in the learning rate, with the best learner moving through the program more than twice as quickly as the slowest learner. Our goal was to figure out which (if any) of the measures recorded initially predicted those differences.

A new brain measure for language aptitude

When we correlated our measures with learning rate, we found that patterns of brain activity that have been linked to linguistic processes predicted how easily people could learn a second language.

Patterns of activity over the right side of the brain predicted upwards of 60 percent of the differences in second language learning across individuals. This finding is consistent with previous research showing that the right half of the brain is more frequently used with a second language.

Our results suggest that the majority of the language learning differences between participants could be explained by the way their brain was organized before they even started learning.
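In outline, the analysis described above amounts to computing a per-participant learning rate and correlating it with a resting-state brain measure. The sketch below uses synthetic numbers and illustrative variable names – it is not the study’s data or code – but the arithmetic is the same: a correlation r between the two measures, whose square gives the proportion of individual differences explained.

```python
import numpy as np

rng = np.random.default_rng(0)
n_participants = 19   # as in the study
n_sessions = 16       # eight weeks, two sessions per week

# Synthetic resting-state measure (e.g. activity over right-hemisphere sites)
eeg_measure = rng.normal(size=n_participants)

# Synthetic final training level, loosely tied to the EEG measure plus noise
final_level = 10 + 4 * eeg_measure + rng.normal(scale=2.0, size=n_participants)

# Learning rate: level reached per training session
learning_rate = final_level / n_sessions

# Correlate the brain measure with learning rate; r**2 is the proportion
# of individual differences in learning that the measure explains
r = np.corrcoef(eeg_measure, learning_rate)[0, 1]
variance_explained = r ** 2
```

On this reading, the study’s “upwards of 60 percent” figure corresponds to a variance_explained of about 0.6, leaving the remaining 40 percent to factors such as attention and motivation.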

Implications for learning a new language

Does this mean that if you, like me, don’t have a “quick second language learning” brain you should forget about learning a second language?

Not quite.

Language learning can depend on many factors.
Child image via www.shutterstock.com

First, it is important to remember that 40 percent of the difference in language learning rate still remains unexplained. Some of this is certainly related to factors like attention and motivation, which are known to be reliable predictors of learning in general, and of second language learning in particular.

Second, we know that people can change their resting-state brain activity. So training may help to shape the brain into a state in which it is more ready to learn. This could be an exciting future research direction.

Second language learning in adulthood is difficult, but the benefits are large for those who, like myself, are motivated by the desire to communicate with others who do not speak their native tongue.


Brianna Yamasaki, Ph.D. Student, University of Washington

This article was originally published on The Conversation. Read the original article.

The Oxford dictionary’s new words are a testament to the fluid beauty of English


Annabelle Lukin

The Oxford English Dictionary – the “OED” to its friends – has announced a 2016 update, consisting of over 1,000 new words and word meanings, along with the revision or expansion of over 2,000 entries.

The revisions are not just new words and phrases, like “glamping”, “air-punching”, “sweary” and “budgie smugglers”. The OED has also revised its entry for “bittem”, an obsolete word more than 1,000 years old meaning “the keel or lower part of a ship’s hull”.

Australia’s most famous wearer of budgie smugglers.
Mick Tsikas/AAP

Where did the new words come from? Some are borrowed from other languages, such as “narcocorrido” (a Spanish word for a traditional Mexican ballad recounting the exploits of drug traffickers), “potjie” (from Afrikaans, a three-legged cast iron cooking pot for use over a fire), and “shishito” (from Japanese, a particular kind of chilli used in Asian cooking).

Some additions are deeply revealing of our modern preoccupations – such as the terms “assisted death” and “assisted dying”. This category also includes the word “agender” (without gender), born of a communal reaction to our deeply binary thinking around gender. The OED dates its use first to the year 2000.

The OED has also added new “initialisms”. To its existing list, which included IMF (International Monetary Fund) and IDB (illicit diamond buyer), it has added ICYMI (in case you missed it), IRL (in real life), IDK (I don’t know), and FFS (look that one up if you don’t know it already!)

Many of the new entries are made by combining words. Some of these fit the definition of “compound words”, that is, words formed by joining two together, such as “air-punching”, “bare-knuckle”, “self-identity” and “straight-acting”. Others are just two words put side-by-side, such as “power couple”, “hockey mum”, “test drive” and “star sign”.

The term ‘power couple’ has been blessed by the OED.
Luke MacGregor/Reuters

Clearly some of these terms – “budgie smugglers” for instance – have been around for some time. The OED dates this term to 1998. The source is The Games, the Australian mockumentary television series about the 2000 Olympic Games in Sydney.

But to make it into a TV program like this, the term must already have been an established expression in the Australian lexicon. The only corpus of English comparing usage across various countries, the GloWbE corpus, shows how deeply Australian the term “budgie smugglers” is.

Frequency of the expression ‘budgie smugglers’ in the Corpus of Global Web-Based English (GloWbE).

The expression “battle of the sexes”, meanwhile, has only just made it into the dictionary – even though the OED first attests its use as far back as 1723.

Then there are the new forms from old stock. For instance, to the verb “exploit,” the OED is adding an adjective (“exploitational”), an adverb (“exploitatively”), and a noun to denote someone who is exploiting someone or something (“exploiter”).

To the verb “to swear” the OED now includes “sweary”, both as noun (a swear word can be called “a sweary”) and adjective (meaning something or someone characterized by a lot of swearing).

Why the wait?

So how do words get into the dictionary? “Lexicographers” – the folk who make dictionaries – add words only when there is evidence of usage over some period of time, and across various contexts of usage. The process for Oxford dictionaries is explained here.

A dictionary can never hold every word of a language. The only estimate I know suggests that well over half the words of English are not recorded by dictionaries. Since this research is based on the Google Books corpus, the data is only from published books in university libraries. We can safely say this figure is very conservative.

Somewhere around 400 million people speak English as a native language. But linguist David Crystal estimates three times as many speak English as an additional language. Thanks to colonization, English is the primary language for countries as diverse as Barbados, Singapore, and Belize.

This latest OED update includes the publication of written and spoken pronunciations for additional English varieties, including those spoken in Australia, Canada, the Caribbean, Hong Kong, Ireland, New Zealand, the Philippines, Scotland, Singapore, Malaysia and South Africa. While some of these varieties already had coverage, their presentation has been expanded.

In praise of Singlish

The entries for Hong Kong and Singapore English are entirely new. Speakers of Singapore English (or “Singlish”) – I count myself as a reasonable speaker of this dialect – will be delighted to see the inclusion of words such as “ang moh” (a light-skinned person of Western origin), “Chinese helicopter” (a derogatory term for a Singaporean whose schooling was conducted in Mandarin Chinese and whose knowledge of English is limited), “killer litter” (objects thrown or falling from high-rise buildings, endangering the people below) and “shiok” (an expression of admiration).

If you think English belongs to Anglos, then you can start by banishing the word “yum cha” from your vocabulary. For a good laugh at Australian English, and the Indian variety, try the series “How to speak Australians”, from the “Delhi Institute of Linguistics”.

By adding the “World Englishes” to its entries on British and American English, the OED has opened a Pandora’s box. For instance, read the OED’s explanation for choosing “White South African English” as the model for its entries on South African English.

Changes to the OED remind us that a language is not a fixed entity. Not only is English constantly changing, but its boundaries are fluid.

Languages are open and dynamic: open to other dialects and their many and varied users. Therein lies both the power and beauty of language.

The Conversation

Annabelle Lukin, Associate professor

This article was originally published on The Conversation. Read the original article.

How the British military became a champion for language learning



Wendy Ayres-Bennett, University of Cambridge

When an army deploys in a foreign country, there are clear advantages if the soldiers are able to speak the local language or dialect. But what if your recruits are no good at other languages? In the UK, where language learning in schools and universities is facing a real crisis, the British army began to see this as a serious problem.

In a new report on the value of languages, my colleagues and I showcased how a new language policy, instituted last year within the British Army, was triggered by a growing appreciation of the risks language shortages pose for national security.

Following the conflicts in Iraq and Afghanistan, the military sought to implement language skills training as a core competence. Speakers of other languages are encouraged to take examinations to register their language skills, whether they are language learners or speakers of heritage or community languages.

The UK Ministry of Defence’s Defence Centre for Language and Culture also offers training to NATO standards across the four language skills – listening, speaking, reading and writing. Core languages taught are Arabic, Dari, Farsi, French, Russian, Spanish and English as a foreign language. Cultural training that provides regional knowledge and cross-cultural skills is still embryonic, but developing fast.

Cash incentives

There are two reasons why this is working. First, the change was directed by the vice chief of the defence staff, and therefore had a high-level champion. Second, there are financial incentives for army personnel to have their linguistic skills recorded, ranging from £360 for a lower-level western European language to £11,700 for a high-level, operationally vital linguist. Currently, any army officer must have a basic language skill to be able to command a sub-unit.

A British army sergeant visits a school in Helmand, Afghanistan.
Defence Images/flickr.com, CC BY-NC

We should not, of course, overstate the progress made. The numbers of Ministry of Defence linguists for certain languages, including Arabic, are still precariously low and, according to recent statistics, there are no speakers of Ukrainian or Estonian classed at level three or above in the armed forces. But, crucially, the organisational culture has changed and languages are now viewed as an asset.

Too fragmented

The British military’s new approach is a good example of how an institution can change the culture of the way it thinks about languages. It’s also clear that language policy can no longer simply be a matter for the Department for Education: champions for language both within and outside government are vital for issues such as national security.

This is particularly important because of the fragmentation of language learning policy within the UK government, despite an informal cross-Whitehall language focus group.

Experience on the ground illustrates the value of cooperation when it comes to security. For example, in January, the West Midlands Counter Terrorism Unit urgently needed a speaker of a particular language dialect to assist with translating communications in an ongoing investigation. The MOD was approached and was able to source a speaker within another department.

There is a growing body of research demonstrating the cost to business of the UK’s lack of language skills. Much less is known about their value to national security, defence and diplomacy, conflict resolution and social cohesion. Yet language skills have to be seen as an asset, and appreciation is needed across government for their wider value to society and security.

The Conversation

Wendy Ayres-Bennett, Professor of French Philology and Linguistics, University of Cambridge

This article was originally published on The Conversation. Read the original article.

English has taken over academia: but the real culprit is not linguistic



Anna Kristina Hultgren, The Open University and Elizabeth J. Erling, The Open University

Not only is April 23 the anniversary of William Shakespeare’s death, but the UN has chosen it as UN English Language Day in tribute to the Bard.

If growth in the number of speakers is a measure of success, then the English language certainly deserves to be celebrated. Since the end of World War I, it has risen to become the language with the highest number of non-native users in the world and is the most frequently used language among people who don’t share the same language in business, politics and academia.

In universities in countries where English is not the official language, English is increasingly used as a medium of instruction and is often the language in which academics prefer to publish their research.

In Europe alone, the number of undergraduate and masters programmes fully taught in English grew from 2,389 in 2007 to 8,089 in 2014 – a 239% increase.
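That growth figure is easy to verify. As a quick sanity check (a sketch added here, not part of the original article), the percentage increase follows directly from the two programme counts quoted above:

```python
# Growth in English-taught programmes in Europe (figures quoted in the text).
programmes_2007 = 2389
programmes_2014 = 8089

# Percentage increase relative to the 2007 baseline.
increase_pct = (programmes_2014 - programmes_2007) / programmes_2007 * 100

print(round(increase_pct))  # → 239
```

Rounding the 238.6% result to the nearest whole number gives the 239% increase cited in the article.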

In academic publishing, the use of English has a longer history, especially in the sciences. In 1880, only 36% of publications were in English. The proportion had risen to 50% by 1940-50, 75% by 1980 and 91% by 1996, with the figures for the social sciences and humanities slightly lower.

Today, the proportion of academic articles in the Nordic countries which are published in English is between 70% and 95%, and for doctoral dissertations it’s 80% to 90%.

Pros and cons of using English

One frequently cited advantage of publishing in English is that academics can reach a wider audience and also engage in work produced outside of their own language community. This facilitates international collaboration and, at least ideally, strengthens and validates research. In teaching, using English enables the mobility of staff and students and makes it possible for students to study abroad and get input from other cultures. It also helps develop language skills and intercultural awareness.

But some downsides have been identified. In the Nordic countries, for example, the national language councils have expressed concerns at the lack of use of national languages in academia. They’ve argued that this may impoverish these languages, making it impossible to communicate about scientific issues in Swedish, Danish, Finnish, Norwegian and Icelandic. There have also been fears that the quality of education taking place in English is lower because it may be harder to express oneself in a non-native language. And there are concerns about the creation of inequalities between those who speak English well and those who don’t – though this may begin to change.

Research suggests a more nuanced picture. National languages are still being used in academia and are no more threatened here than in other domains. Both teachers and students have been shown to adapt, drawing on strategies and resources that compensate for any perceived loss of learning. The ability to cope with education in a non-native language depends on a number of factors, such as level of English proficiency – which varies significantly across the world.

English built into the system

Some solutions to these problems have focused on devising language policies which are meant to safeguard local languages. For instance, many Nordic universities have adopted a “parallel language policy”, which accords equal status to English and to the national language (or languages, in the case of Finland, which has two official languages, Finnish and Swedish). While such initiatives may serve important symbolic functions, research suggests that they are unlikely to be effective in the long run.

Learning in Oslo – but in what language?
AstridWestvang/flickr.com, CC BY-NC-ND

This is because the underlying causes of these dramatic changes that are happening in academia worldwide are not simply linguistic, but political and economic. A push for competition in higher education has increased the use of research performance indicators and international bench-marking systems that measure universities against each other.

This competitive marketplace means academics are encouraged to publish their articles in high-ranking journals – in effect this means English-language journals. Many ranking lists also measure universities on their degree of internationalisation, which tends to be interpreted rather simplistically as the ratio of international to domestic staff and students. Turning education into a commodity and charging higher tuition fees for overseas students also makes it more appealing for universities to attract international students. This all indirectly leads to a rise in the use of English: a shared language is necessary for such transnational activities to work.

The rise of English in academia is only a symptom of this competition. If the linguistic imbalance is to be redressed, then this must start with confronting the problem of a university system which has elevated competition and performance indicators to its key organising principle, in teaching as well as research.

The Conversation

Anna Kristina Hultgren, Lecturer in English Language and Applied Linguistics, The Open University and Elizabeth J. Erling, Lecturer in English Language Teaching, The Open University

This article was originally published on The Conversation. Read the original article.

Zut alors, Jeremy Paxman! French isn’t a ‘useless’ language



Emmanuelle Labeau, Aston University

Presenter Jeremy Paxman recently hailed the victory of English “in the battle of global tongues” in an article for the Financial Times in which he also claimed that French was “useless” and “bad for you”.

Why such a needless attack on Britain’s closest neighbour and favourite enemy – unless he is trying to court controversy to generate interest in his forthcoming documentary on the UK’s relationship with Europe?

First of all, no language is useless: every language serves communication between people. While languages may differ in the number of users and in the practical and economic advantages attached to their mastery, they bring similar intellectual and developmental benefits. And multilingualism combines and increases all these gains by fostering mental agility, widening horizons and helping to overcome parochialism.

Let us now focus on Paxman’s attack on French. Are his barbs aimed at the language or at the people? It’s difficult to say as he confuses countries and languages throughout his column. While he concedes that: “France has enhanced civilisation”, he argues that its influence has long gone. It is very true that the rise and fall of a language greatly depends on extra-linguistic factors such as politics and economy – and France’s current situation does little to enhance the international prestige of its mother tongue.

Burnt cream anybody? Some things just sound better in French.
Le Journal des Femmes, CC BY

Does that mean that the French language is doomed? Clearly not, as French is the official language in 29 countries and France only represents between a quarter and a third of French speakers in the world. Perhaps we might stop to consider, en passant, why Paxman’s analysis does not extend to Britain and English. How much has the elevated status of English worldwide got to do with Britain in the 21st century?

Speaking in tongues

Paxman writes that: “English is the language of science, technology, travel, entertainment and sports”. And he’s right – to an extent. As we know and as all academics can testify, there is huge pressure to publish in English (without necessarily achieving the heights of Shakespeare’s language). And our daily life has been turned upside down in the past 30 years or so thanks to the discoveries of Silicon Valley (which wasn’t in Britain the last time I looked).

When travelling, an ability to master at least “pidgin” English comes in very handy – although it didn’t get me anywhere in Beijing in 2005, and I had to revert to speaking French while in Italy. As for entertainment, of course, people are flocking from all around Europe to take part in Britain’s Got Talent … but Hollywood may have a role to play on the global entertainment stage as well.

Like Paxman, I would not expect the singer, Johnny Hallyday (who is, in fact, Belgian-born), to be “the future of pop” – he probably deserves a break after his stellar 50-year career. But the rising global fame of the singer Stromae (real name Paul Van Haver) – the son of a Rwandan father and a Flemish mother – seems to show that entertainment through the medium of French may still have a few good years ahead.

History lesson

Paxman also argues that “France never really decolonised” and its continued influence is stifling development in former colonies by imposing French on their higher education systems rather than English which, he says, would be far more useful. The linguistic imperialism of France certainly does not apply to all former colonies, as eloquently illustrated by the disengagement of France in Djibouti, to the dismay of Francophile locals.

The accusation of colonialism against France nonetheless sounds a little ironic coming from a British citizen. The Commonwealth is an organisation of former British colonies in which Britain played an instrumental role. In contrast, Francophonie – the official use of the French language – was adopted in Senegal by the poet and politician Léopold Senghor, in Tunisia by Habib Bourguiba and in Cambodia by Norodom Sihanouk.

English and French have coexisted and exchanged words and phrases for a millennium and more. From a 2016 perspective, there is no denying that English has become the more widespread and widely used of the two, but it may be worth remembering that for the first half of this coexistence English was the poor relation, and it only took off as the lingua franca (oh, the irony) in the late 18th century.

History teaches us that civilisations and their languages soar and collapse – what would Alexander the Great make of Greece’s current situation? With that in mind, Paxman would be well advised to moderate his triumphalism. How will global English fare in five centuries – will it suffer the same fate as Latin? And even if it could be argued that our hyperconnected civilisation may prevent the death of English, there is no dearth of evidence from the blooming field of “global English” studies to show that the language changes as it conquers the world.

Your language has won the latest battle, Mr Paxman – but no more than that. As you say, the future may belong to those who speak English – but above all, it belongs to those who speak English fluently alongside other languages. And, as former US president George Bush discovered, French is very useful if you want to acquire a really sophisticated vocabulary in English.

The Conversation

Emmanuelle Labeau, Senior Lecturer in French Language and Linguistics, Aston University

This article was originally published on The Conversation. Read the original article.

How language drives students’ transition from rural to urban areas



Thelma Kathleen Buchholz Mort, Cape Peninsula University of Technology

Molofo and Bulewani are training as teachers at a university in one of South Africa’s largest cities, Cape Town. Both young men come from rural backgrounds and English is not their first language. Their experiences of moving from a rural area to a city, and of becoming English speakers, offer a fascinating insight into how language development and social transition are intertwined.

There are about 25,720 state schools in South Africa, and 11,252 are designated as rural. These rural schools tend to be poorly resourced – some don’t have proper furniture, let alone enough teachers or textbooks. Most pupils are taught in their mother tongues, not English, and even if they do learn in English they have little chance to practise speaking it at home or outside school. Pupils from schools in such areas tend not to perform as well in their final exams as their urban counterparts.

It was language that set Bulewani and Molofo apart from their classmates. I interviewed them, along with two other teaching students, as part of a research project presented at the 2015 South African Education Research Association Conference. An article based on this research has been submitted to the SA Journal of Education and is under review. The research findings echo results from elsewhere in the world: participants reported that “leaving behind” their home languages and their physical homes produced a sense of both loss and gain.

Social distance

It is important that this research used rural areas as a context. Such areas tend to be linguistically, educationally and economically isolated from the rest of a country. The students’ experiences are about more than just geographic distance between their rural homes and the city where they study – they’re about social distance, too.

US educationalist John H Schumann talks about this idea of social distance in his research, explaining it as the distance between two language groups in second-language acquisition. Social closeness involves being embedded in a culture. The more culturally comfortable one is, the less the social distance and the easier it is to learn the relevant new language.

Lives in transition

Molofo and Bulewani come from areas where they weren’t surrounded by English speakers. In some rural schools, even the teachers are not particularly proficient in English. Pupils are meant to be taught according to a policy of additive bilingualism – they learn in their mother tongues until Grade 4, and then switch to English as the language of teaching and learning.

This seldom happens, and neither Molofo nor Bulewani learned English this way. They had good English teachers who forced them to speak the language, and both found that they loved it. By the end of their school careers, the young men spoke English well enough to pass it and qualify for university entrance. They also spoke it well enough and had performed well enough at school to earn bursaries. Without this financial support, they would not have been able to take up their university places.

There were two transitions at play for Molofo and Bulewani. One was a physical move from a rural to an urban area. The other was a shift from functioning in their home or mother tongue to primarily speaking English. Both transitions were facilitated by their acceptance to university. The move came at a cost, though. One of the questions posed in the research was whether students felt that their culture had changed or was under threat because they had learned English. Both said they were losing tradition – but that this wasn’t necessarily a bad thing.

Molofo comes from the Eastern Cape province and grew up in an area governed by a chief. Such areas operate under traditional law. Constitutional democracy, with its notions of guaranteed rights, is remote. He had discovered a greater sense of equality and justice since moving to the city, explaining:

I am not that much interested in a traditional way because there is a lot that I discover that is not fair. Some of the things are not happening in the way they are supposed to. It depends maybe on who you are.

Loss

Bulewani celebrated the fact that he felt more in charge of his own destiny since his “transitions”. But he also experienced profound loss. His family – who are also from the Eastern Cape – lived off the land, and he missed this way of life. He remarked, for instance, that while at home he could go and pick something from the fields, whereas in the city he had to go out and spend money to buy food.

Mostly, though, his feelings of loss revolved around language:

I am losing a lot of words. I miss a lot of words … I am becoming more educated, but I am losing a lot of things in my culture. I am learning a lot of things from Western culture. Talking English. But I am losing a lot of things. I am losing some Xhosa language and traditions.

New voices

Universities need to start collecting more background information about their students to help them settle into this new environment and achieve their goals. For instance, institutions don’t know how many students are from rural areas and might be grappling with the sorts of changes Molofo and Bulewani articulated.

These young men’s voices open an important window on South Africa’s fast-changing society. They are at the forefront of a change that brings obvious gains, but that is also bittersweet and accompanied by a sense of loss.

The Conversation

Thelma Kathleen Buchholz Mort, PhD student with the Centre for International Teacher Education, Cape Peninsula University of Technology

This article was originally published on The Conversation. Read the original article.