How learning a new language improves tolerance


Why learn a new language?
Timothy Vollmer, CC BY

Amy Thompson, University of South Florida

There are many benefits to knowing more than one language. For example, it has been shown that aging adults who speak more than one language are less likely to develop dementia.

Additionally, the bilingual brain becomes better at filtering out distractions, and learning multiple languages improves creativity. Evidence also shows that learning subsequent languages is easier than learning the first foreign language.

Unfortunately, not all American universities consider learning foreign languages a worthwhile investment.

Why is foreign language study important at the university level?

As an applied linguist, I study how learning multiple languages can have cognitive and emotional benefits. One of these benefits that’s not obvious is that language learning improves tolerance.

This happens in two important ways.

The first is that it opens people’s eyes to ways of doing things that differ from their own, which is called “cultural competence.”

The second is related to the comfort level of a person when dealing with unfamiliar situations, or “tolerance of ambiguity.”

Gaining cross-cultural understanding

Cultural competence is key to thriving in our increasingly globalized world. How specifically does language learning improve cultural competence? The answer can be illuminated by examining different types of intelligence.

Psychologist Robert Sternberg’s research on intelligence describes different types of intelligence and how they are related to adult language learning. What he refers to as “practical intelligence” is similar to social intelligence in that it helps individuals learn nonexplicit information from their environments, including meaningful gestures or other social cues.

Learning a foreign language reduces social anxiety.
COD Newsroom, CC BY

Language learning inevitably involves learning about different cultures. Students pick up clues about the culture both in language classes and through meaningful immersion experiences.

Researchers Hanh Thi Nguyen and Guy Kellogg have shown that when students learn another language, they develop new ways of understanding culture through analyzing cultural stereotypes. They explain that “learning a second language involves the acquisition not only of linguistic forms but also ways of thinking and behaving.”

With the help of an instructor, students can critically think about stereotypes of different cultures related to food, appearance and conversation styles.

Dealing with the unknown

The second way that adult language learning increases tolerance is related to a person’s comfort level when dealing with unfamiliar situations, or “tolerance of ambiguity.”

Someone with a high tolerance of ambiguity finds unfamiliar situations exciting, rather than frightening. My research on motivation, anxiety and beliefs indicates that language learning improves people’s tolerance of ambiguity, especially when more than one foreign language is involved.

It’s not difficult to see why this may be so. Conversations in a foreign language will inevitably involve unknown words. It wouldn’t be a successful conversation if one of the speakers constantly stopped to say, “Hang on – I don’t know that word. Let me look it up in the dictionary.” Those with a high tolerance of ambiguity would feel comfortable maintaining the conversation despite the unfamiliar words involved.

Applied linguists Jean-Marc Dewaele and Li Wei also study tolerance of ambiguity and have indicated that those with experience learning more than one foreign language in an instructed setting have more tolerance of ambiguity.

What changes with this understanding

A high tolerance of ambiguity brings many advantages. It helps students become less anxious in social interactions and in subsequent language learning experiences. Not surprisingly, the more experience a person has with language learning, the more comfortable the person gets with this ambiguity.

And that’s not all.

Individuals with higher levels of tolerance of ambiguity have also been found to be more entrepreneurial (i.e., are more optimistic, innovative and don’t mind taking risks).

In the current climate, universities are frequently being judged by the salaries of their graduates. Taking it one step further: given the relationship between tolerance of ambiguity and entrepreneurial intention, increased tolerance of ambiguity could lead to higher salaries for graduates, which in turn, I believe, could help increase funding for universities that require foreign language study.

Those who have devoted their lives to theorizing about and teaching languages would say, “It’s not about the money.” But perhaps it is.

Language learning in higher ed

Most American universities have a minimal language requirement that often varies depending on the student’s major. However, students can typically opt out of the requirement by taking a placement test or providing some other proof of competency.

Why more universities should teach a foreign language.
sarspri, CC BY-NC

In contrast to this trend, Princeton recently announced that all students, regardless of their competency when entering the university, would be required to study an additional language.

I’d argue that more universities should follow Princeton’s lead, as language study at the university level could lead to an increased tolerance of the different cultural norms represented in American society. This is desperately needed in the current political climate, with a wave of hate crimes sweeping university campuses nationwide.

Knowledge of different languages is crucial to becoming global citizens. As former Secretary of Education Arne Duncan noted,

“Our country needs to create a future in which all Americans understand that by speaking more than one language, they are enabling our country to compete successfully and work collaboratively with partners across the globe.”

Considering the evidence that studying languages as adults increases tolerance in two important ways, the question shouldn’t be “Why should universities require foreign language study?” but rather “Why in the world wouldn’t they?”

Amy Thompson, Associate Professor of Applied Linguistics, University of South Florida

This article was originally published on The Conversation. Read the original article.


Language puts ordinary people at a disadvantage in the criminal justice system


‘Now, did you understand all that?’
Shutterstock

David Wright, Nottingham Trent University

Language is pervasive throughout the criminal justice system. A textual chain follows a person from the moment they are arrested until their day in court, and it is all underpinned by meticulously drafted legislation. At every step, there are challenges faced by laypeople who find themselves in the linguistic webs of the justice system.

Anyone who reads a UK act of parliament, for example, is met with myriad linguistic complexities. Archaic formulae, complex prepositions, and lengthy, embedded clauses abound in the pages of the law. Such language can render legal texts inaccessible to the everyday reader. Some argue (see Vijay Bhatia’s chapter) that this is a deliberate ploy by the legal establishment to keep the non-expert at arm’s length.

But closer to the truth is the fact that legal language, like all language in all contexts, is the way it is because of its function and purpose. Those drafting laws must ensure enough precision and unambiguity so that the law can be applied, while also being flexible and inclusive enough to account for the unpredictability of human behaviour.

The cost of this linguistic balancing act, however, is increased complexity and the exclusion of the uninitiated. Legal language has long been in the crosshairs of the Plain English Campaign, which argues for its simplification, claiming that “if we can’t understand our rights, we have no rights”.

It is not only written legal language that presents difficulties for the layperson. Once someone is arrested they go through a chain of communicative events, each one coloured by institutional language, and each one with implications for the next. It begins with the arresting officer reading the suspect their rights. In England and Wales, the police caution reads:

You do not have to say anything. But, it may harm your defence if you do not mention when questioned something which you later rely on in court. Anything you do say may be given in evidence.

This may seem very familiar to many readers (perhaps due to their penchant for police dramas), but this short set of statements is linguistically complex. The force of the verb “may”, what exactly constitutes “mentioning” or “relying”, and what “questioning” is and when it will take place are just some of the ambiguities that may be overlooked at first glance.

What the research says

Indeed, research has found that, although people claim to fully comprehend the caution, they are often incapable of demonstrating any understanding of it at all. Frances Rock has also written extensively on the language of cautioning and found that when police officers explain the caution to detainees in custody, there is substantial variation in the explanations offered. Some explanations add clarity, while others introduce even more puzzles.

This issue of comprehensibility is compounded, of course, when the detainee is not a native speaker of English.

The word of the law.
Shutterstock

The difficulties in understanding legal language are typically overcome by the hiring of legal representation. Peter Tiersma, in his seminal 1999 book Legal Language, noted that “the hope that every man can be his own lawyer, which has existed for centuries, is probably no more realistic than having people be their own doctor”.

However, in the UK at least, cuts in legal aid mean that more people are representing themselves, removing the protection of a legal-language expert. Work by Tatiana Tkacukova has revealed the communicative struggles of these so-called “litigants in person” as they step into the courtroom arena of seasoned legal professionals.

Trained lawyers have developed finely-tuned cross-examination techniques, and all witnesses who take the stand, including the alleged victim or plaintiff, are likely to be subjected to gruelling cross-examination, characterised by coercive and controlling questioning. At best, witnesses might emerge from the courtroom feeling frustrated, and at worst victims may leave feeling victimised once again.

The work of forensic linguists has led to progress in some areas. For instance, it is long established that the cross-examination of alleged rape victims is often underpinned by societal preconceptions and prejudices which, when combined with rigorous questioning, are found to traumatise victims further. Recent reforms in England and Wales provide rape victims with the option to avoid “live” courtroom cross-examination and may go some way towards addressing this issue.

Further afield, an international group of linguists, psychologists, lawyers and interpreters have produced a set of guidelines for communicating rights to non-native speakers of English in Australia, England and Wales, and the US. These guidelines include recommendations for the wording and communication of cautions and rights to detainees, which aim to protect those already vulnerable from further problems of misunderstanding in the justice system.

Language will forever remain integral to our criminal justice system, and it will continue to disadvantage many who find themselves in the process. However, as the pool and remit of forensic linguists grows, there are greater opportunities to rebalance the linguistic inequalities of the legal system in favour of the layperson.

David Wright, Lecturer in Linguistics, Nottingham Trent University


The world’s words of the year pass judgement on a dark, surreal 2016



Philip Seargeant, The Open University

Every December, lexicographers around the world choose their “words of the year”, and this year, perhaps more than ever, the stories these tell provide a fascinating insight into how we’ve experienced the drama and trauma of the last 12 months.

There was much potential in 2016. It was 500 years ago that Thomas More wrote his Utopia, and January saw the launch of a year’s celebrations under the slogan “A Year of Imagination and Possibility” – but as 2017 looms, this slogan rings hollow. Instead of utopian dreams, we’ve had a year of “post-truth” and “paranoia”, of “refugee” crises, “xenophobia” and a close shave with “fascism”.

Earlier in the year, a campaign was launched to have “Essex Girl” removed from the Oxford English Dictionary (OED). Those behind the campaign were upset at the derogatory definition – a young woman “characterised as unintelligent, promiscuous, and materialistic” – so wanted it to be expunged from the official record of the language.

The OED turned down the request, a spokeswoman explaining that since the OED is a historical dictionary, nothing is ever removed; its purpose, she said, is to describe the language as people use it, and to stand as a catalogue of the trends and preoccupations of the time.

The words of the year tradition began with the German Wort des Jahres in the 1970s. It has since spread to other languages, and become increasingly popular the world over. Those in charge of the choices are getting more innovative: in 2015, for the first time, Oxford Dictionaries chose a pictograph as their “word”: the emoji for “Face with Tears of Joy”.

In 2016, however, the verbal was very much back in fashion. The results speak volumes.

Dark days

In English, there are a range of competing words, with all the major dictionaries making their own choices. Having heralded a post-language era last year, Oxford Dictionaries decided on “post-truth” this time, defining it as the situation when “objective facts are less influential in shaping public opinion than appeals to emotion and personal belief”. In a year of evidence-light Brexit promises and Donald Trump’s persistent lies and obfuscations, this has a definite resonance. In the same dystopian vein, the Cambridge Dictionary chose “paranoid”, while Dictionary.com went for “xenophobia”.

Merriam-Webster valiantly tried to turn back the tide of pessimism. When “fascism” looked set to win its online poll, it tweeted its readers imploring them to get behind something – anything – else. The plea apparently worked, and in the end “surreal” won the day. Apt enough for a year in which events time and again almost defied belief.

The referendum that spawned a thousand words.
EPA/Andy Rain

Collins, meanwhile, chose “Brexit”, a term which its spokesperson suggested has become as flexible and influential in political discourse as “Watergate”.

Just as the latter spawned hundreds of portmanteau words whenever a political scandal broke, so Brexit begat “Bremain”, “Bremorse” and “Brexperts” – and will likely be adapted for other upcoming political rifts for many years to come. It nearly won out in Australia in fact, where “Ausexit” (severing ties with the British monarchy or the United Nations) was on the shortlist. Instead, the Australian National Dictionary went for “democracy sausage” – the tradition of eating a barbecued sausage on election day.

Around the world, a similar pattern of politics and apprehension emerges. In France, the mot de l’année was réfugiés (refugees); and in Germany postfaktisch, meaning much the same as “post-truth”. Swiss German speakers, meanwhile, went for Filterblase (filter bubble), the idea that social media is creating increasingly polarised political communities.

Switzerland’s Deaf Association, meanwhile, chose a Sign of the Year for the first time. Its choice was “Trump”, consisting of a gesture made by placing an open palm on the top of the head, mimicking the president-elect’s extravagant hairstyle.

2016’s golden boy, as far as Japan’s concerned.
Albert H. Teich

Trump’s hair also featured in Japan’s choice for this year. Rather than a word, Japan chooses a kanji (Chinese character); 2016’s choice is “金” (gold). This represented a number of different topical issues: Japan’s haul of medals at the Rio Olympics, fluctuating interest rates, the gold shirt worn by singer and YouTube sensation Piko Taro, and, inevitably, the colour of Trump’s hair.

And then there’s Austria, whose word is 51 letters long: Bundespräsidentenstichwahlwiederholungsverschiebung. It means “the repeated postponement of the runoff vote for Federal President”. Referring to the seven months of votes, legal challenges and delays over the country’s presidential election, this again references an event that flirted with extreme nationalism and exposed the convoluted nature of democracy. As a new coinage, it also illustrates language’s endless ability to creatively grapple with unfolding events.

Which brings us, finally, to “unpresidented”, a neologism Donald Trump inadvertently created when trying to spell “unprecedented” in a tweet attacking the Chinese. At the moment, it’s a word in search of a meaning, but the possibilities it suggests seem to speak perfectly to the history of the present moment. And depending on what competitors 2017 throws up, it could well emerge as a future candidate.


Philip Seargeant, Senior Lecturer in Applied Linguistics, The Open University


The shelf-life of slang – what will happen to those ‘democracy sausages’?



Kate Burridge, Monash University

Every year around this time, dictionaries across the English-speaking world announce their “Word of the Year”. These are expressions (some newly minted and some golden oldies too) that for some reason have shot into prominence during the year.

Earlier this month The Australian National Dictionary Centre declared its winner “democracy sausage” – the barbecued snag that on election day makes compulsory voting so much easier to swallow.

Dictionaries make their selections in different ways, but usually it involves a combination of suggestions from the public and the editorial team (who have been meticulously tracking these words throughout the year). The Macquarie Dictionary has two selections – the Committee’s Choice made by the Word of the Year Committee, and the People’s Choice made by the public (so make sure you have your say on January 24 for the People’s Choice winner 2016).

It’s probably not surprising that these words of note draw overwhelmingly from slang language, or “slanguage” – a fall-out of the increasing colloquialisation of English usage worldwide. In Australia this love affair with the vernacular goes back to the earliest settlements of English speakers.

And now there’s the internet, especially social networking – a particularly fertile breeding ground for slang.

People enjoy playing with language, and when communicating electronically they have free rein. “Twitterholic”, “twaddiction”, “celebritweet/twit”, “twitterati” are just some of the “tweologisms” that Twitter has spawned of late. And with a reported average of 500 million tweets each day, Twitter has considerable capacity not only to create new expressions, but to spread them (as do Facebook, Instagram and other social networking platforms).

But what happens when slang terms like these make it into the dictionary? Early dictionaries give us a clue, particularly the entries that are stamped unfit for general use. Branded entries were certainly plentiful in Samuel Johnson’s 18th-century work, and many are now wholly respectable: abominably “a word of low or familiar language”, nowadays “barbarous usage”, fun “a low cant word” (what would Johnson have thought of very fun and funner?).

Since the point of slang is to mark an in-group, to amuse and perhaps even to shock outsiders with novelty, most slang expressions are short-lived. Those that survive become part of the mainstream and mundane. Quite simply, time drains them of their vibrancy and energy. J.M. Wattie put it more poetically back in 1930:

Slang terms are the mayflies of language; by the time they get themselves recorded in a dictionary, they are already museum specimens.

But, then again, expressions occasionally do sneak through the net. Not only do they survive, they stay slangy – and sometimes over centuries. Judge for yourselves. Here are some entries from A New and Comprehensive Vocabulary of the Flash Language. Written by British convict James Hardy Vaux in 1812, this is the first dictionary compiled in Australia.

croak “to die”

grub “food”

kid “deceive”

mug “face”

nuts on “to have a strong inclination towards something or someone”

on the sly “secretly”

racket “particular kind of fraud”

snitch “to betray”

stink “an uproar”

spin a yarn “tell a tale of great adventure”

These were originally terms of flash – or, as Vaux put it, “the cant language used by the family”. In other words, they belonged to underworld slang. The term slang itself meant something similar at this time; it broadened to highly colloquial language in the 1800s.

Vaux went on to point out that “to speak good flash is to be well versed in cant terms” — and, having been transported to New South Wales on three separate occasions during his “checkered and eventful life” (his words), Vaux himself was clearly well versed in the world of villainy and cant.

True, the majority of the slang terms here have dropped by the wayside (barnacles “spectacles”; lush “to drink”), and the handful that survive are now quite standard (grab “to seize”; dollop “large quantity”). But there are a few that have not only lasted, they’ve remained remarkably contemporary-sounding – some still even a little “disgraceful” (as Vaux described them).

The shelf-life of slang is a bit of a mystery. Certainly some areas fray faster than others. Vaux’s prime, plummy and rum (meaning “excellent”) have well and truly bitten the dust. Cool might have made a comeback (also from the 1800s), but intensifiers generally wear out.

Far out and ace have been replaced by awesome, and there are plenty of new “awesome” words lurking in the wings. Some of these are already appearing on lists for “Most Irritating Word of the Year” – it’s almost as if their success does them in. Amazeballs, awesomesauce and phat are among the walking dead.

But as long as sausage sizzles continue to support Australian voters on election day, democracy sausages will have a place – and if adopted elsewhere, might even entice the politically uninterested into polling booths.


Kate Burridge, Professor of Linguistics, Monash University


How training can prepare teachers for diversity in their classrooms



Maureen Robinson, Stellenbosch University

Teachers have been shaping lives for centuries. Everyone remembers their favourite (and of course their least favourite) teachers. This important group of people even has its own special day, marked each October by the United Nations.

Teachers are at the coal face when it comes to watching societies change. South Africa’s classrooms, for instance, look vastly different today than they did two decades ago. They bring together children from different racial, cultural, economic and social backgrounds. This can sometimes cause conflict as varied ways of understanding the world bump up against each other.

How can teachers develop the skills to work with these differences in productive ways? What practical support do they need to bring the values of the Constitution to life in their classes?

To answer these questions, my colleagues and I in the Faculty of Education at Stellenbosch University have put together four examples from modules within our faculty’s teacher education programme. These ideas are by no means exhaustive; other institutions also tackle these issues. What we present here is based on our own research, teaching and experience and is open to further discussion.

1. Working with multilingualism

English is only South Africa’s fifth most spoken home language. Teachers must remember this: even if their pupils are speaking English in the classroom, their home languages may be far more diverse.

Trainee teachers can benefit enormously from a course on multilingual education. In our faculty, for instance, students are given the chance to place multilingual education in a South African policy framework. They model multilingual classroom strategies like code switching and translation. They visit schools to observe how such strategies are applied in the real classroom. Students then report back on whether this approach helps learners from different language backgrounds to participate actively in the lesson.

There’s also great value in introducing student teachers to the notion of “World Englishes”. This focuses on the role of English in multilingual communities, where it is seen as being used for communication and academic purposes rather than as a way for someone to be integrated into an English community.

2. Supporting diverse learning needs

Student teachers must be trained to identify and support pupils’ diverse learning needs. This helps teachers to identify and address barriers to learning and development and encourages linkages between the home and the school.

This is even more meaningful when it is embedded in experiential learning. For instance, in guided exercises with their own class groups, our students engage with their feelings, experiences and thinking about their own backgrounds and identities. Other activities may be based on real scenarios, such as discussing the case of a boy who was sanctioned by his school for wearing his hair in a way prescribed by his religion.

In these modules we focus on language, culture, race, socioeconomic conditions, disability, sexual orientation, learning differences and behavioural, health or emotional difficulties. The students also learn how to help vulnerable learners who are being bullied.

And these areas are constantly expanding. At Stellenbosch University, we’ve recently noted that we need to prepare teachers to deal with the bullying of LGBT learners. They also need to be equipped with the tools to support pupils who’ve immigrated from elsewhere in Africa.

3. Advancing a democratic classroom

Courses that deal with the philosophy of education are an important element of teacher education. These explore notions of diversity, human dignity, social justice and democratic citizenship.

In these classes, student teachers are encouraged to see their own lecture rooms as spaces for open and equal engagement, with regard and respect for different ways of being. They’re given opportunities to express and engage with controversial views. This stands them in good stead to create such spaces in their own classrooms.

Most importantly, students are invited to critically reconsider commonly held beliefs – and to disrupt their ideas of the world – so that they might encounter the other as they are and not as they desire them to be. In such a classroom, a teacher promotes discussion and debate. She cultivates respect and regard for the other by listening to different accounts and perspectives. Ultimately, the teacher accepts that she is just one voice in the classroom.

4. Understanding constitutional rights in the classroom

All the approaches to teacher education described here are underpinned by the Constitution.

The idea is that teacher education programmes should develop teachers who understand notions of justice, citizenship and social cohesion. Any good teacher needs to be able to reflect critically on their own role as leader and manager within the contexts of classrooms, schools and the broader society. This includes promoting values of democracy, social justice and equality, and building attitudes of respect and reciprocity.

A critical reflective ethos is encouraged. Students get numerous opportunities to interrogate, debate, research, express and reflect upon educational challenges, theories and policies, from different perspectives, as these apply to practice. This is all aimed at building a positive school environment for everyone.

Moving into teaching

What about when students become teachers themselves?

For many new teachers these inclusive practices are not easy to implement in schools. One lecturer in our faculty has been approached by former students who report that as beginner teachers, they don’t have “the status or voice to change existing discriminatory practices and what some experience as the resistance to inclusive education”. This suggests that ongoing discussion and training in both pre-service and in-service education is needed.

At the same time, however, there are signs that these modules are having a positive impact. Students post comments and ideas on social media and lecturers regularly hear from first-time teachers about how useful their acquired knowledge is in different contexts. Many are also eager to study further so they can explore the issues more deeply.

Everything I’ve described here is part of one faculty’s attempts to provide safe spaces where student teachers can learn to work constructively with the issues pertaining to diversity in education. In doing so, we hope they’ll become part of building a country based on respect for all.

Author’s note: I am grateful to my colleagues Lynette Collair, Nuraan Davids, Jerome Joorst and Christa van der Walt for the ideas contained in this article.


Maureen Robinson, Dean, Faculty of Education, Stellenbosch University


Clear skies ahead: how improving the language of aviation could save lives



Dominique Estival, Western Sydney University

The most dangerous part of flying is driving to the airport.

That’s a standard joke among pilots, who know even better than the flying public that aviation is the safest mode of transportation.

But there are still those headlines and TV shows about airline crashes, and those statistics people like to repeat, such as:

Between 1976 and 2000, more than 1,100 passengers and crew lost their lives in accidents in which investigators determined that language had played a contributory role.

True enough, 80% of all air incidents and accidents occur because of human error. Miscommunication, combined with other human factors such as fatigue, cognitive workload, noise or forgetfulness, has played a role in some of the deadliest accidents.

The best known, and most widely discussed, is the collision on the ground of two Boeing 747 aircraft in Tenerife in 1977, which resulted in 583 fatalities. The incident was due in part to difficult communications between the pilot, whose native language was Dutch, and the Spanish air traffic controller.

In such a high-stakes environment as commercial aviation, where the lives of hundreds of passengers and innocent people on the ground are involved, communication is critical to safety.

So, it was decided that Aviation English would be the international language of aviation and that all aviation professionals – pilots and air traffic controllers (ATC) – would need to be proficient in it. It is a highly structured and codified language, designed to minimise ambiguities and misunderstandings.

Pilots and ATC expect to hear certain bits of information in certain ways and in a given order. The “phraseology”, with its particular pronunciation (for example, “fife” and “niner” instead of “five” and “nine”, so they’re not confused with each other), specific words (“Cleared to land”), international alphabet (“Mike Hotel Foxtrot”) and strict conversation rules (you must repeat, or “read back”, an instruction), needs to be learned and practised.

In spite of globalisation and the spread of English, most people around the world are not native English speakers, and an increasing number of aviation professionals do not speak English as their first language.

Native speakers have an advantage when they learn Aviation English, since they already speak English at home and in their daily lives. But they encounter many pilots or ATC who learned English as a second or even third language.

Whose responsibility is it to ensure that communication is successful? Can native speakers simply speak the way they do at home and expect to be understood? Or do they also have the responsibility to make themselves understood and to learn how to understand pilots or ATC who are not native English speakers?

As a linguist, I analyse aviation language from a linguistics perspective. I have noted the restricted meaning of its few verbs and adjectives; that the only pronouns are “you” and sometimes “we” (“How do you read?”; “We’re overhead Camden”); that there are very few questions, mostly imperatives (“Maintain heading 180”); and that the syntax is so simple (no complement clauses, no relative clauses, no recursion) it might not even count as a human language for Chomsky.

But, as a pilot and a flight instructor, I look at it from the point of view of student pilots learning to use it in the cockpit while also learning to fly the airplane and navigate around the airfield.

How much harder is it to remember what to say when the workload goes up, and to speak over the radio when you know everyone else on the frequency is listening and will notice every little mistake you make?

Imagine, then, how much more difficult this is for pilots with English as a second language.

Camden Airport.
Supplied

Everyone learning another language knows it’s suddenly more challenging to hold a conversation over the phone than face-to-face, even with someone you already know. When it’s over the radio, with someone you don’t know, against the noise of the engine, static noise in the headphones, and while trying to make the plane do what you want it to do, it can be quite daunting.

No wonder student pilots who are not native English speakers sometimes prefer to stay silent, and even some experienced native English speakers will too, when the workload is too great.

This is one of the results of my research conducted in collaboration with UNSW’s Brett Molesworth, combining linguistics and aviation human factors.

Experiments in a flight simulator with pilots of diverse language backgrounds and flying experience explored the conditions likely to result in pilots making mistakes or misunderstanding ATC instructions. Not surprisingly, increased workload, too much information and rapid ATC speech all caused mistakes.

Also not surprisingly, less experienced pilots, no matter their English proficiency, made more mistakes. But surprisingly, it was the level of training, rather than number of flying hours or language background, that predicted better communication.

Once we understand the factors contributing to miscommunication in aviation, we can propose solutions to prevent it. For example, technologies such as Automatic Speech Recognition and Natural Language Understanding may help catch errors in pilot readbacks that ATC did not notice, and might complement training for pilots and ATC.
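To see why automated readback checking is even conceivable, consider how constrained the language is. The sketch below is a deliberately naive illustration, not a description of any real system: it compares the safety-critical tokens (numbers and a small, hypothetical set of instruction keywords) in an ATC instruction against a pilot's readback. A real system would sit downstream of speech recognition and use far more sophisticated language understanding.

```python
# Naive readback check: extract digits and key instruction words from both
# utterances and compare them, ignoring callsigns and fillers. The keyword
# list here is illustrative only.
import re

KEYWORDS = {"climb", "descend", "maintain", "heading", "runway",
            "cleared", "land", "hold", "flight", "level"}

def critical_tokens(utterance: str) -> list[str]:
    """Return the digits and instruction keywords, in order of appearance."""
    tokens = re.findall(r"[a-z]+|\d+", utterance.lower())
    return [t for t in tokens if t.isdigit() or t in KEYWORDS]

def readback_matches(instruction: str, readback: str) -> bool:
    """True if the readback repeats every critical token of the instruction."""
    return critical_tokens(instruction) == critical_tokens(readback)

print(readback_matches("Maintain heading 180", "Maintain heading 180, MHF"))  # True
print(readback_matches("Maintain heading 180", "Maintain heading 160, MHF"))  # False
```

The second call shows the kind of error – a misheard digit – that such a tool could flag for a busy controller.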

It is vital that pilots and ATC understand each other, whatever their native language.

The Conversation

Dominique Estival, Researcher in Linguistics, Western Sydney University

This article was originally published on The Conversation. Read the original article.

Beware the bad big wolf: why you need to put your adjectives in the right order



Simon Horobin, University of Oxford

Unlikely as it sounds, the topic of adjective use has gone “viral”. The furore centres on the claim, taken from Mark Forsyth’s book The Elements of Eloquence, that adjectives appearing before a noun must appear in the following strict sequence: opinion, size, age, shape, colour, origin, material, purpose, Noun. Even the slightest attempt to disrupt this sequence, according to Forsyth, will result in the speaker sounding like a maniac. To illustrate this point, Forsyth offers the following example: “a lovely little old rectangular green French silver whittling knife”.
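Forsyth's claimed sequence can be sketched mechanically, assuming we already know each adjective's category – which is, of course, the hard part in practice. The category labels and the small lexicon below are illustrative, not taken from Forsyth's book:

```python
# Sort adjectives by the position of their category in Forsyth's claimed
# sequence. The lexicon is a hand-built toy covering only his example.
ORDER = ["opinion", "size", "age", "shape", "colour",
         "origin", "material", "purpose"]

LEXICON = {
    "lovely": "opinion", "little": "size", "old": "age",
    "rectangular": "shape", "green": "colour", "French": "origin",
    "silver": "material", "whittling": "purpose",
}

def forsyth_sort(adjectives):
    """Order adjectives by their category's position in ORDER."""
    return sorted(adjectives, key=lambda a: ORDER.index(LEXICON[a]))

scrambled = ["green", "French", "lovely", "old",
             "silver", "little", "whittling", "rectangular"]
print(" ".join(forsyth_sort(scrambled) + ["knife"]))
# lovely little old rectangular green French silver whittling knife
```

Any permutation of the input comes out in the same fixed order – which is exactly the strictness of the rule that the rest of this article calls into question.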


But is the “rule” worthy of an internet storm – or is it more of a ripple in a teacup? Well, certainly the example is a rather unlikely sentence, and not simply because whittling knives are not in much demand these days – ignoring the question of whether they can be both green and silver. This is because it is unusual to have a string of attributive adjectives (ones that appear before the noun they describe) like this.

More usually, speakers of English break up the sequence by placing some of the adjectives in predicative position – after the noun. Not all adjectives, however, can be placed in either position. I can refer to “that man who is asleep” but it would sound odd to refer to him as “that asleep man”; we can talk about the “Eastern counties” but not the “counties that are Eastern”. Indeed, our distribution of adjectives both before and after the noun reveals another constraint on adjective use in English – a preference for no more than three before a noun. An “old brown dog” sounds fine, a “little old brown dog” sounds acceptable, but a “mischievous little old brown dog” sounds plain wrong.

Rules, rules, rules

Nevertheless, however many adjectives we choose to employ, they do indeed tend to follow a predictable pattern. While native speakers intuitively follow this rule, most are unaware that they are doing so; we agree that the “red big dog” sounds wrong, but don’t know why. In order to test this intuition, linguists have analysed large corpora of electronic data to see how frequently pairs of adjectives like “big red” are preferred to “red big”. The results confirm our native intuition, although the figures are not as comprehensive as we might expect – the rule accounts for 78% of the data.
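The corpus test is simple in principle: count how often each ordering of a given adjective pair occurs. A minimal version, run over a toy corpus rather than the billions of words corpus linguists actually use, looks like this:

```python
# Count both orderings of an adjective pair in a (toy) corpus by scanning
# adjacent word pairs.
from collections import Counter

corpus = ("the big red dog chased a big red ball "
          "while the red big cat watched the big red dog")

def pair_counts(text: str, a: str, b: str) -> Counter:
    """Count occurrences of 'a b' and 'b a' as adjacent words in text."""
    words = text.split()
    counts = Counter()
    for w1, w2 in zip(words, words[1:]):
        if {w1, w2} == {a, b}:
            counts[f"{w1} {w2}"] += 1
    return counts

print(pair_counts(corpus, "big", "red"))
# Counter({'big red': 3, 'red big': 1})
```

Scaled up to a real corpus, ratios like this one are what lie behind the 78% figure quoted above.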

We know how to use them … without even being aware of it.
Shutterstock

But while linguists have been able to confirm that there are strong preferences in the ordering of pairs of adjectives, no such statistics have been produced for longer strings. Consequently, while Forsyth’s rule appears to make sense, it remains an untested, hypothetical, large, sweeping (sorry) claim.

In fact, even if we stick to just two adjectives it is possible to find examples that appear to break the rule. The “big bad wolf” of fairy tale, for instance, shows the size adjective preceding the opinion one; similarly, “big stupid” is more common than “stupid big”. Examples like these are instead witness to the “Pollyanna Principle”, by which speakers prefer to present positive, or indifferent, values before negative ones.

Another problem with Forsyth’s proposed ordering is that it makes no reference to other constraints on adjective order, such as what happens when we use two adjectives from the same category. Little Richard’s song “Long Tall Sally” would have sounded strange if he had called it “Tall Long Sally”, yet both are adjectives of size.

Definitely not Tall Long Sally.

Similarly, we might describe a meal as “nice and spicy” but never “spicy and nice” – reflecting a preference for the placement of general opinions before more specific ones. We also need to bear in mind the tendency for noun phrases to become lexicalised – forming words in their own right. Just as a blackbird is not any kind of bird that is black, a little black dress does not refer to any small black dress but one that is suitable for particular kinds of social engagement.

Since speakers view a “little black dress” as a single entity, its order is fixed; as a result, modifying adjectives must precede “little” – a “polyester little black dress”. This means that an adjective specifying its material appears before those referring to size and colour, once again contravening Forsyth’s rule.

Making sense of language

Of course, the rule is a fair reflection of much general usage – although the reasons behind this complex set of constraints in adjective order remain disputed. Some linguists have suggested that it reflects the “nouniness” of an adjective; since colour adjectives are commonly used as nouns – “red is my favourite colour” – they appear close to that slot.

Another conditioning factor may be the degree to which an adjective reflects a subjective opinion rather than an objective description – therefore, subjective adjectives that are harder to quantify (boring, massive, middle-aged) tend to appear further away from the noun than more concrete ones (red, round, French).

Prosody, the rhythm and sound of poetry, is likely to play a role, too – as there is a tendency for speakers to place longer adjectives after shorter ones. But probably the most compelling theory links adjective position with semantic closeness to the noun being described; adjectives that are closely related to the noun in meaning, and are therefore likely to appear frequently in combination with it, are placed closest, while those that are less closely related appear further away.

In Forsyth’s example, it is the knife’s whittling capabilities that are most significant – distinguishing it from a carving, fruit or butter knife – while its loveliness is hardest to define (what are the standards for judging the loveliness of a whittling knife?) and thus most subjective. Whether any slight reorganisation of the other adjectives would really prompt your friends to view you as a knife-wielding maniac is harder to determine – but then, at least it’s just a whittling knife.


Simon Horobin, Professor of English Language and Literature, University of Oxford

This article was originally published on The Conversation. Read the original article.

How the Queen’s English has had to defer to Africa’s rich multilingualism


Rajend Mesthrie, University of Cape Town

For the first time in history a truly global language has emerged. English enables international communication par excellence, with a far wider reach than other possible candidates for this position – like Latin in the past, and French, Spanish and Mandarin in the present.

In a memorable phrase, former Tanzanian statesman Julius Nyerere once characterised English as the Kiswahili of the world. In Africa, English is more widely spoken than other important lingua francas like Kiswahili, Arabic, French and Portuguese, with at least 26 countries using English as one of their official languages.

But English in Africa comes in many different shapes and forms. It has taken root in an exceptionally multilingual context, with well over a thousand languages spoken on the continent. The influence of this multilingualism tends to be largely erased at the most formal levels of use – for example, in the national media and in higher educational contexts. But at an everyday level, the Queen’s English has had to defer to the continent’s rich abundance of languages. Pidgin, creole, second-language and first-language English all flourish alongside them.

The birth of new languages

English did not enter Africa as an innocent language. Its history is tied up with trade and exploitation, capitalist expansion, slavery and colonisation.

The history of English is tied up with trade, capitalist expansion, slavery and colonialism.
Shutterstock

As the need for communication arose and increased under these circumstances, forms of English, known as pidgins and creoles, developed. This took place within a context of unequal encounters, a lack of sustained contact with speakers of English and an absence of formal education. Under these conditions, English words were learnt and attached to an emerging grammar that owed more to African languages than to English.

A pidgin is defined by linguists as an initially simple form of communication that arises from contact between speakers of disparate languages who have no other means of communication in common. Pidgins, therefore, do not have mother-tongue speakers. The existence of pidgins in the early period of West African–European contact is not well documented, and some linguists, like Salikoko Mufwene, judge their early significance to be overestimated.

Pidgins can become more complex if they take on new functions. They are relabelled creoles if, over time and under specific circumstances, they become fully developed as the first language of a group of speakers.

Ultimately, pidgins and creoles develop grammatical norms that are far removed from the colonial forms that partially spawned them: to a British English speaker listening to a pidgin or creole, the words may seem familiar in form, but not always in meaning.

Linguists pay particular attention to these languages because they afford them the opportunity to observe creativity at first hand: the birth of new languages.

The creoles of West Africa

West Africa’s creoles are of two types: those that developed outside Africa; and those that first developed from within the continent.

The West African creoles that developed outside Africa emerged out of the multilingual and oppressive slave experience in the New World. They were then brought to West Africa after 1787 by freed slaves repatriated from Britain, North America and the Caribbean. “Krio” was the name given to the English-based creole of slaves freed from Britain who were returned to Sierra Leone, where they were joined by slaves released from Nova Scotia and Jamaica.

Some years after that, in 1821, Liberia was established as an African homeland for freed slaves from the US. These men and women brought with them what some linguists call “Liberian settler English”. This particular creole continues to make Liberia somewhat special on the continent, with American rather than British forms of English dominating there.

These languages from the New World were very influential in their new environments, especially over the developing West African pidgin English.

A more recent, homegrown type of West African creole has emerged in the region. This West African creole is spreading in the context of urban multilingualism and changing youth identities. Over the past 50 years, it has grown spectacularly in Ghana, Cameroon, Equatorial Guinea and Sierra Leone, and it is believed to be the fastest-growing language in Nigeria. In this process pidgin English has been expanded into a creole, used as one of the languages of the home. For such speakers, the designation “pidgin” is now a misnomer, although it remains widely used.

In East Africa, in contrast, the strength and historicity of Kiswahili as a lingua franca prevented the rapid development of pidgins based on colonial languages. There, traders and colonists had to learn Kiswahili for successful everyday communication. This gave locals more time to master English as a fully-fledged second language.

Other varieties of English

Africa, mirroring the trend in the rest of the world, has a large and increasing number of second-language English speakers. Second-language varieties of English are mutually intelligible with first-language versions, while showing varying degrees of difference in accent, grammar and nuance of vocabulary. Formal colonisation and the educational system from the 19th century onwards account for the wide spread of second-language English.

What about first-language varieties of English on the continent? The South African variety looms large in this history, showing similarities with English in Australia and New Zealand, especially in details of accent.

In post-apartheid South Africa many young black people from middle-class backgrounds now speak this variety either as a dominant language or as a “second first-language”. But for most South Africans English is a second language – a very important one for education, business and international communication.

For family and cultural matters, African languages remain of inestimable value throughout the continent.


Rajend Mesthrie, Professor of Linguistics, University of Cape Town

This article was originally published on The Conversation. Read the original article.

How the British military became a champion for language learning



Wendy Ayres-Bennett, University of Cambridge

When an army deploys in a foreign country, there are clear advantages if the soldiers are able to speak the local language or dialect. But what if your recruits are no good at other languages? In the UK, where language learning in schools and universities is facing a real crisis, the British army began to see this as a serious problem.

In a new report on the value of languages, my colleagues and I showcased how a new language policy instituted last year within the British Army was triggered by a growing appreciation of the risks language shortages pose for national security.

Following the conflicts in Iraq and Afghanistan, the military sought to implement language skills training as a core competence. Speakers of other languages are encouraged to take examinations to register their language skills, whether they are language learners or speakers of heritage or community languages.

The UK Ministry of Defence’s Defence Centre for Language and Culture also offers training to NATO standards across the four language skills – listening, speaking, reading and writing. Core languages taught are Arabic, Dari, Farsi, French, Russian, Spanish and English as a foreign language. Cultural training that provides regional knowledge and cross-cultural skills is still embryonic, but developing fast.

Cash incentives

There are two reasons why this is working. The change was directed by the vice chief of the defence staff, and therefore had a high-level champion. There are also financial incentives for army personnel to have their linguistic skills recorded, ranging from £360 for a lower-level western European language to £11,700 for a high-level, operationally vital linguist. Currently, any army officer must have a basic language skill to be able to command a sub-unit.

A British army sergeant visits a school in Helmand, Afghanistan.
Defence Images/flickr.com, CC BY-NC

We should not, of course, overstate the progress made. The numbers of Ministry of Defence linguists for certain languages, including Arabic, are still precariously low and, according to recent statistics, there are no speakers of Ukrainian or Estonian classed at level three or above in the armed forces. But, crucially, the organisational culture has changed and languages are now viewed as an asset.

Too fragmented

The British military’s new approach is a good example of how an institution can change the culture of the way it thinks about languages. It’s also clear that language policy can no longer simply be a matter for the Department for Education: champions for language both within and outside government are vital for issues such as national security.

This is particularly important because of the fragmentation of language learning policy within the UK government, despite an informal cross-Whitehall language focus group.

Experience on the ground illustrates the value of cooperation when it comes to security. For example, in January, the West Midlands Counter Terrorism Unit urgently needed a speaker of a particular language dialect to assist with translating communications in an ongoing investigation. The MOD was approached and was able to source a speaker within another department.

There is a growing body of research demonstrating the cost to business of the UK’s lack of language skills. Much less is known about their value to national security, defence and diplomacy, conflict resolution and social cohesion. Yet language skills have to be seen as an asset, and appreciation is needed across government for their wider value to society and security.


Wendy Ayres-Bennett, Professor of French Philology and Linguistics, University of Cambridge

This article was originally published on The Conversation. Read the original article.

Britain may be leaving the EU, but English is going nowhere



Andrew Linn, University of Westminster

After Brexit, there are various things that some in the EU hope to see and hear less in the future. One is Nigel Farage. Another is the English language.

In the early hours of June 24, as the referendum outcome was becoming clear, Jean-Luc Mélenchon, left-wing MEP and French presidential candidate, tweeted that “English cannot be the third working language of the European parliament”.

This is not the first time that French and German opinion has weighed in against alleged disproportionate use of English in EU business. In 2012, for example, a similar point was made about key eurozone recommendations from the European Commission being published initially “in a language which [as far as the Euro goes] is only spoken by less than 5m Irish”. With the number of native speakers of English in the EU set to drop from 14% to around 1% of the bloc’s total with the departure of the UK, this point just got a bit sharper.

Translation overload

Official EU language policy is multilingualism with equal rights for all languages used in member states. It recommends that “every European citizen should master two other languages in addition to their mother tongue” – Britain’s abject failure to achieve this should make it skulk away in shame.

The EU recognises 24 “official and working” languages, a number that has mushroomed from the original four (Dutch, French, German and Italian) as more countries have joined. All EU citizens have a right to access EU documents in any of those languages. This calls for a translation team numbering around 2,500, not to mention a further 600 full-time interpreters. In practice most day-to-day business is transacted in either English, French or German and then translated, but it is true that English dominates to a considerable extent.

Lots of work still to do.
Etienne Ansotte/EPA

The preponderance of English has nothing to do with the influence of Britain or even Britain’s membership of the EU. Historically, the expansion of the British empire, the impact of the industrial revolution and the emergence of the US as a world power have embedded English in the language repertoire of speakers across the globe.

Unlike Latin, which outlived the Roman empire as the lingua franca of medieval and renaissance Europe, English of course has native speakers (who may be unfairly advantaged), but it is those who have learned English as a foreign language – “Euro-English” or “English as a lingua franca” – who now constitute the majority of users.

According to the 2012 Special Eurobarometer on Europeans and their Languages, English is the most widely spoken foreign language in 19 of the member states where it is not an official language. Across Europe, 38% of people speak English well enough as a foreign language to have a conversation, compared with 12% for French and 11% for German.

The report also found that 67% of Europeans consider English the most useful foreign language, and that the numbers favouring German (17%) or French (16%) have declined. As a result, 79% of Europeans want their children to learn English, compared to 20% for French and German.

Too much invested in English

Huge sums have been invested in English teaching by both national governments and private enterprise. As the demand for learning English has increased, so has the supply. English language learning worldwide was estimated to be worth US$63.3 billion (£47.5 billion) in 2012, and it is expected that this market will rise to US$193.2 billion (£145.6 billion) by 2017. The value of English for speakers of other languages is not going to diminish any time soon. There is simply too much invested in it.

Speakers of English as a second language outnumber first-language English speakers by 2:1 both in Europe and globally. For many Europeans, and especially those employed in the EU, English is a useful piece in a toolbox of languages to be pressed into service when needed – a point which was evident in a recent project on whether the use of English in Europe was an opportunity or a threat. So in the majority of cases using English has precisely nothing to do with the UK or Britishness. The EU needs practical solutions and English provides one.

English is unchallenged as the lingua franca of Europe. It has even been suggested that in some countries of northern Europe it has become a second rather than a foreign language. Jan Paternotte, D66 party leader in Amsterdam, has proposed that English should be decreed the official second language of that city.

English has not always held its current privileged status. French and German have both functioned as common languages for high-profile fields such as philosophy, science and technology, politics and diplomacy, not to mention Church Slavonic, Russian, Portuguese and other languages in different times and places.

We can assume that English will not maintain its privileged position forever. Those who benefit now, however, are not the predominantly monolingual British, but European anglocrats whose multilingualism provides them with a key to international education and employment.

Much about the EU may be about to change, but right now an anti-English language policy so dramatically out of step with practice would simply make the post-Brexit hangover more painful.


Andrew Linn, Pro-Vice-Chancellor and Dean of Social Sciences and Humanities, University of Westminster

This article was originally published on The Conversation. Read the original article.