Language puts ordinary people at a disadvantage in the criminal justice system


‘Now, did you understand all that?’
Shutterstock

David Wright, Nottingham Trent University

Language is pervasive throughout the criminal justice system. A textual chain follows a person from the moment they are arrested until their day in court, and it is all underpinned by meticulously drafted legislation. At every step, there are challenges faced by laypeople who find themselves in the linguistic webs of the justice system.

Anyone who reads a UK act of parliament, for example, is met with myriad linguistic complexities. Archaic formulae, complex prepositions, and lengthy, embedded clauses abound in the pages of the law. Such language can render legal texts inaccessible to the everyday reader. Some argue (see Vijay Bhatia’s chapter) that this is a deliberate ploy by the legal establishment to keep the non-expert at arm’s length.

But closer to the truth is the fact that legal language, like all language in all contexts, is the way it is because of its function and purpose. Those drafting laws must ensure enough precision and unambiguity so that the law can be applied, while also being flexible and inclusive enough to account for the unpredictability of human behaviour.

The cost of this linguistic balancing act, however, is increased complexity and the exclusion of the uninitiated. Legal language has long been in the crosshairs of the Plain English Campaign, which argues for its simplification, claiming that “if we can’t understand our rights, we have no rights”.

It is not only written legal language that presents difficulties for the layperson. Once someone is arrested they go through a chain of communicative events, each one coloured by institutional language, and each one with implications for the next. It begins with the arresting officer reading the suspect their rights. In England and Wales, the police caution reads:

You do not have to say anything. But it may harm your defence if you do not mention when questioned something which you later rely on in court. Anything you do say may be given in evidence.

This may seem very familiar to many readers (perhaps due to their penchant for police dramas), but this short set of statements is linguistically complex. The strength of the verb “may”, what exactly constitutes “mentioning” or “relying”, and what “questioning” is and when it will take place are just some of the ambiguities that may be overlooked at first glance.

What the research says

Indeed, research has found that, although people claim to fully comprehend the caution, they are often incapable of demonstrating any understanding of it at all. Frances Rock has also written extensively on the language of cautioning and found that when police officers explain the caution to detainees in custody, there is substantial variation in the explanations offered. Some explanations add clarity, while others introduce even more puzzles.

This issue of comprehensibility is compounded, of course, when the detainee is not a native speaker of English.

The word of the law.
Shutterstock

The difficulties in understanding legal language are typically overcome by the hiring of legal representation. Peter Tiersma, in his seminal 1999 book Legal Language, noted that “the hope that every man can be his own lawyer, which has existed for centuries, is probably no more realistic than having people be their own doctor”.

However, in the UK at least, cuts in legal aid mean that more people are representing themselves, removing the protection of a legal-language expert. Work by Tatiana Tkacukova has revealed the communicative struggles of these so-called “litigants in person” as they step into the courtroom arena of seasoned legal professionals.

Trained lawyers have developed finely-tuned cross-examination techniques, and all witnesses who take the stand, including the alleged victim or plaintiff, are likely to be subjected to gruelling cross-examination, characterised by coercive and controlling questioning. At best, witnesses might emerge from the courtroom feeling frustrated, and at worst victims may leave feeling victimised once again.

The work of forensic linguists has led to progress in some areas. For instance, it is long established that the cross-examination of alleged rape victims is often underpinned by societal preconceptions and prejudices which, when combined with rigorous questioning, are found to traumatise victims further. Recent reforms in England and Wales provide rape victims with the option to avoid “live” courtroom cross-examination and may go some way towards addressing this issue.

Further afield, an international group of linguists, psychologists, lawyers and interpreters have produced a set of guidelines for communicating rights to non-native speakers of English in Australia, England and Wales, and the US. These guidelines include recommendations for the wording and communication of cautions and rights to detainees, which aim to protect those already vulnerable from further problems of misunderstanding in the justice system.

Language will forever remain integral to our criminal justice system, and it will continue to disadvantage many who find themselves in the process. However, as the pool and remit of forensic linguists grow, there are greater opportunities to rebalance the linguistic inequalities of the legal system in favour of the layperson.

David Wright, Lecturer in Linguistics, Nottingham Trent University

This article was originally published on The Conversation. Read the original article.


Is there such a thing as a national sense of humour?


A statue celebrating Monty Python’s sketch The Dead Parrot near London’s Tower Bridge ahead of a live show on the TV channel Gold.
DAVID HOLT/Flickr, CC BY-SA

Gary McKeown, Queen’s University Belfast

We’re all aware that there are stereotypes. The British are sharply sarcastic, the Americans are great at physical comedy, and the Japanese love puns. But is humour actually driven by culture to any meaningful extent? Couldn’t it be more universal – or depend largely on the individual?

There are some good reasons to believe that there is such a thing as a national sense of humour. But let’s start with what we actually have in common, by looking at the kinds of humour that most easily transcend borders.

Certain kinds of humour are more commonly used in circumstances that are international and multicultural in nature – such as airports. When it comes to onboard entertainment, airlines, in particular, are fond of humour that transcends cultural and linguistic boundaries for obvious reasons. Slapstick humour and the bland but almost universally tolerable social transgressions and faux pas of Mr Bean permit a safe, gentle humour that we can all relate to. The silent situational dilemmas of the Canadian Just for Laughs hidden-camera reality television show have likewise been a staple option for airlines for many years.

Just for laughs.

These have a broad reach and are probably unlikely to offend most people. Of course, an important component in their broad appeal is that they are not really based on language.

Language and culture

Most humour, and certainly humour that involves greater cognitive effort, is deeply embedded in language and culture. It relies on a shared language or set of culturally based constructs to function. Puns and idioms are obvious examples.

Indeed, most modern theories of humour suggest that some form of shared knowledge is one of the key foundations of humour – that is, after all, what a culture is.

Some research has demonstrated this. One study measured humour in Singaporean college students and compared it with that of North American and Israeli students. This was done using a questionnaire asking participants to describe jokes they found funny, among other things. The researchers found that the Americans were more likely to tell sex jokes than the Singaporeans. The Singaporean jokes, on the other hand, were slightly more often focused on violence. The researchers interpreted the lack of sex jokes among Singaporean students to be a reflection of a more conservative society. Aggressive jokes may be explained by a cultural emphasis on strength for survival.

International humour?
C.P.Storm/Flickr, CC BY-SA

Another study compared Japanese and Taiwanese students’ appreciation of English jokes. It found that the Taiwanese generally enjoyed jokes more than the Japanese and were also more eager to understand incomprehensible jokes. The authors argued that this could be down to a more hierarchical culture in Japan, leaving less room for humour.

Denigration and self-deprecation

There are many overarching themes that can be used to define a nation’s humour. A nation that laughs together is one that can show it has a strong allegiance between its citizens. Laughter is one of our main social signals and combined with humour it can emphasise social bonding – albeit sometimes at the cost of denigrating other groups. This can be seen across many countries. For example, the French tend to enjoy a joke about the Belgians while Swedes make fun of Norwegians. Indeed, most nations have a preferred country that serves as a traditional butt of their jokes.

Sexist and racist humour are also examples of this sort of denigration. The types of jokes used can vary across cultures, but the phenomenon itself can boost social bonding. Knowledge of acceptable social boundaries is therefore crucial and reinforces social cohesion. As denigration is usually not the principal aim of the interaction, this helps explain why people often fail to realise that they are being offensive when they were “only joking”. However, as the world becomes more global and tolerant of difference, this type of humour is much less acceptable in cultures that welcome diversity.

Self-denigration or self-deprecation is also important – if it is relatively mild and remains within acceptable social norms. Benign violation theory argues that something that threatens social or cultural norms can also result in humour.

Importantly, what constitutes a benign level of harm is strongly culturally bound and differs from nation to nation, between social groups within nations and over the course of a nation’s history. What was once tolerable as national humour can now seem very unacceptable. For the British, it may be acceptable to make fun of Britons being overly polite, orderly or reluctant to talk to strangers. However, jokes about the nature of Britain’s colonial past would be much more contentious – they would probably violate social norms without being emotionally benign.

Another factor is our need to demonstrate that we understand the person we are joking with. My own ideas suggest we even have a desire to display skills of knowing what another person thinks – mind-reading in the scientific sense. For this, cultural alignment and an ability to display it are key elements in humour production and appreciation – it can make us joke differently with people from our own country than with people from other cultures.

‘Fork handles’.

For example, most people in the UK know that the popular phrase “don’t mention the war” refers to a Fawlty Towers sketch. Knowing that “fork handles” is funny also marks you as a UK citizen (see video above). Similarly, knowledge of “I Love Lucy” or quotes from Seinfeld create affiliation among many in the US, while reference to “Chavo del Ocho” or “Chapulín Colorado” do the same for Mexicans and most Latin Americans.

These shared cultural motifs – here drawn mostly from television – are one important aspect of a national sense of humour. They create a sense of belonging and camaraderie. They make us feel more confident about our humour and can be used to build further jokes on.

A broadly shared sense of humour is probably one of our best indicators for how assimilated we are as a nation. Indeed, a nation’s humour is more likely to show unity within a country than to display a nation as being different from other nations in any meaningful way.

Gary McKeown, Senior Lecturer of Psychology, Queen’s University Belfast

This article was originally published on The Conversation. Read the original article.

Younger is not always better when it comes to learning a second language


Learning a language in a classroom is best for early teenagers.
from http://www.shutterstock.com

Warren Midgley, University of Southern Queensland

It’s often thought that it is better to start learning a second language at a young age. But research shows that this is not necessarily true. In fact, the best age to start learning a second language can vary significantly, depending on how the language is being learned.

The belief that younger children are better language learners is based on the observation that children learn to speak their first language with remarkable skill at a very early age.

Before they can add two small numbers or tie their own shoelaces, most children develop a fluency in their first language that is the envy of adult language learners.

Why younger may not always be better

Two theories from the 1960s continue to have a significant influence on how we explain this phenomenon.

The theory of “universal grammar” proposes that children are born with an instinctive knowledge of the language rules common to all humans. Upon exposure to a specific language, such as English or Arabic, children simply fill in the details around those rules, making the process of learning a language fast and effective.

The other theory, known as the “critical period hypothesis”, posits that at around the age of puberty most of us lose access to the mechanism that made us such effective language learners as children. These theories have been contested, but nevertheless they continue to be influential.

Despite what these theories would suggest, however, research into language learning outcomes demonstrates that younger may not always be better.

In some language learning and teaching contexts, older learners can be more successful than younger children. It all depends on how the language is being learned.

Language immersion environment best for young children

Living, learning and playing in a second language environment on a regular basis is an ideal learning context for young children. Research clearly shows that young children are able to become fluent in more than one language at the same time, provided there is sufficient engagement with rich input in each language. In this context, it is better to start as young as possible.

Learning in classroom best for early teens

Learning in language classes at school is an entirely different context. The normal pattern of these classes is to have one or more hourly lessons per week.

To succeed at learning with such little exposure to rich language input requires meta-cognitive skills that do not usually develop until early adolescence.

For this style of language learning, the later years of primary school are an ideal time to start, to maximise the balance between meta-cognitive skill development and the number of consecutive years of study available before the end of school.

Self-guided learning best for adults

There are, of course, some adults who decide to start to learn a second language on their own. They may buy a study book, sign up for an online course, purchase an app or join face-to-face or virtual conversation classes.

To succeed in this learning context requires a range of skills that are not usually developed until reaching adulthood, including the ability to remain self-motivated. Therefore, self-directed second language learning is more likely to be effective for adults than younger learners.

How we can apply this to education

What does this tell us about when we should start teaching second languages to children? In terms of the development of language proficiency, the message is fairly clear.

If we are able to provide lots of exposure to rich language use, early childhood is better. If the only opportunity for second language learning is through more traditional language classes, then late primary school is likely to be just as good as early childhood.

However, if language learning relies on being self-directed, it is more likely to be successful after the learner has reached adulthood.

Warren Midgley, Associate Professor of Applied Linguistics, University of Southern Queensland

This article was originally published on The Conversation. Read the original article.

The world’s words of the year pass judgement on a dark, surreal 2016



Philip Seargeant, The Open University

Every December, lexicographers around the world choose their “words of the year”, and this year, perhaps more than ever, the stories these tell provide a fascinating insight into how we’ve experienced the drama and trauma of the last 12 months.

There was much potential in 2016. It was 500 years ago that Thomas More wrote his Utopia, and January saw the launch of a year’s celebrations under the slogan “A Year of Imagination and Possibility” – but as 2017 looms, this slogan rings hollow. Instead of utopian dreams, we’ve had a year of “post-truth” and “paranoia”, of “refugee” crises, “xenophobia” and a close shave with “fascism”.

Earlier in the year, a campaign was launched to have “Essex Girl” removed from the Oxford English Dictionary (OED). Those behind the campaign were upset at the derogatory definition – a young woman “characterised as unintelligent, promiscuous, and materialistic” – so wanted it to be expunged from the official record of the language.

The OED turned down the request, a spokeswoman explaining that since the OED is a historical dictionary, nothing is ever removed; its purpose, she said, is to describe the language as people use it, and to stand as a catalogue of the trends and preoccupations of the time.

The words of the year tradition began with the German Wort des Jahres in the 1970s. It has since spread to other languages, and become increasingly popular the world over. Those in charge of the choices are getting more innovative: in 2015, for the first time, Oxford Dictionaries chose a pictograph as their “word”: the emoji for “Face with Tears of Joy”.

In 2016, however, the verbal was very much back in fashion. The results speak volumes.

Dark days

In English, there are a range of competing words, with all the major dictionaries making their own choices. Having heralded a post-language era last year, Oxford Dictionaries decided on “post-truth” this time, defining it as the situation when “objective facts are less influential in shaping public opinion than appeals to emotion and personal belief”. In a year of evidence-light Brexit promises and Donald Trump’s persistent lies and obfuscations, this has a definite resonance. In the same dystopian vein, the Cambridge Dictionary chose “paranoid”, while Dictionary.com went for “xenophobia”.

Merriam-Webster valiantly tried to turn back the tide of pessimism. When “fascism” looked set to win its online poll, it tweeted its readers imploring them to get behind something – anything – else. The plea apparently worked, and in the end “surreal” won the day. Apt enough for a year in which events time and again almost defied belief.

The referendum that spawned a thousand words.
EPA/Andy Rain

Collins, meanwhile, chose “Brexit”, a term which its spokesperson suggested has become as flexible and influential in political discourse as “Watergate”.

Just as the latter spawned hundreds of portmanteau words whenever a political scandal broke, so Brexit begat “Bremain”, “Bremorse” and “Brexperts” – and will likely be adapted for other upcoming political rifts for many years to come. It nearly won out in Australia in fact, where “Ausexit” (severing ties with the British monarchy or the United Nations) was on the shortlist. Instead, the Australian National Dictionary went for “democracy sausage” – the tradition of eating a barbecued sausage on election day.

Around the world, a similar pattern of politics and apprehension emerges. In France, the mot de l’année was réfugiés (refugees); and in Germany postfaktisch, meaning much the same as “post-truth”. Swiss German speakers, meanwhile, went for Filterblase (filter bubble), the idea that social media is creating increasingly polarised political communities.

Switzerland’s Deaf Association, meanwhile, chose a Sign of the Year for the first time. Its choice was “Trump”, consisting of a gesture made by placing an open palm on the top of the head, mimicking the president-elect’s extravagant hairstyle.

2016’s golden boy, as far as Japan’s concerned.
Albert H. Teich

Trump’s hair also featured in Japan’s choice for this year. Rather than a word, Japan chooses a kanji (Chinese character); 2016’s choice is “金” (gold). This represented a number of different topical issues: Japan’s haul of medals at the Rio Olympics, fluctuating interest rates, the gold shirt worn by singer and YouTube sensation Piko Taro, and, inevitably, the colour of Trump’s hair.

And then there’s Austria, whose word is 51 letters long: Bundespräsidentenstichwahlwiederholungsverschiebung. It means “the repeated postponement of the runoff vote for Federal President”. Referring to the seven months of votes, legal challenges and delays over the country’s presidential election, this again references an event that flirted with extreme nationalism and exposed the convoluted nature of democracy. As a new coinage, it also illustrates language’s endless ability to creatively grapple with unfolding events.

Which brings us, finally, to “unpresidented”, a neologism Donald Trump inadvertently created when trying to spell “unprecedented” in a tweet attacking the Chinese. At the moment, it’s a word in search of a meaning, but the possibilities it suggests seem to speak perfectly to the history of the present moment. And depending on what competitors 2017 throws up, it could well emerge as a future candidate.


Philip Seargeant, Senior Lecturer in Applied Linguistics, The Open University

This article was originally published on The Conversation. Read the original article.

Clear skies ahead: how improving the language of aviation could save lives



Dominique Estival, Western Sydney University

The most dangerous part of flying is driving to the airport.

That’s a standard joke among pilots, who know even better than the flying public that aviation is the safest mode of transportation.

But there are still those headlines and TV shows about airline crashes, and those statistics people like to repeat, such as:

Between 1976 and 2000, more than 1,100 passengers and crew lost their lives in accidents in which investigators determined that language had played a contributory role.

True enough, 80% of all air incidents and accidents occur because of human error. Miscommunication, combined with other human factors such as fatigue, cognitive workload, noise, or forgetfulness, has played a role in some of the deadliest accidents.

The most well-known, and widely discussed, is the collision on the ground of two Boeing 747 aircraft in 1977 in Tenerife, which resulted in 583 fatalities. The incident was due in part to difficult communications between the pilot, whose native language was Dutch, and the Spanish air traffic controller.

In such a high-stakes environment as commercial aviation, where the lives of hundreds of passengers and innocent people on the ground are involved, communication is critical to safety.

So it was decided that Aviation English would be the international language of aviation, and that all aviation professionals – pilots and air traffic controllers (ATC) – would need to be proficient in it. It is a highly structured and codified language, designed to minimise ambiguity and misunderstanding.

Pilots and ATC expect to hear certain bits of information in certain ways and in a given order. The “phraseology”, with its particular pronunciation (for example, “fife” and “niner” instead of “five” and “nine”, so they’re not confused with each other), specific words (“Cleared to land”), international alphabet (“Mike Hotel Foxtrot”) and strict conversation rules (you must repeat, or “read back”, an instruction), needs to be learned and practised.
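The spelling and digit conventions described above are mechanical enough to sketch in code. Here is a minimal, purely illustrative Python sketch of how a callsign is read out using the standard ICAO spelling alphabet and digit pronunciations; the function and variable names are my own invention, not part of any aviation system:

```python
# ICAO radiotelephony spelling alphabet (letters A-Z).
ICAO_ALPHABET = {
    "A": "Alfa", "B": "Bravo", "C": "Charlie", "D": "Delta",
    "E": "Echo", "F": "Foxtrot", "G": "Golf", "H": "Hotel",
    "I": "India", "J": "Juliett", "K": "Kilo", "L": "Lima",
    "M": "Mike", "N": "November", "O": "Oscar", "P": "Papa",
    "Q": "Quebec", "R": "Romeo", "S": "Sierra", "T": "Tango",
    "U": "Uniform", "V": "Victor", "W": "Whiskey", "X": "X-ray",
    "Y": "Yankee", "Z": "Zulu",
}

# Digits with their prescribed pronunciations, including the modified
# forms ("Tree", "Fower", "Fife", "Niner") that avoid confusion over a
# noisy radio channel.
ICAO_DIGITS = {
    "0": "Zero", "1": "One", "2": "Two", "3": "Tree", "4": "Fower",
    "5": "Fife", "6": "Six", "7": "Seven", "8": "Eight", "9": "Niner",
}

def radio_spell(callsign: str) -> str:
    """Spell out a callsign the way a pilot would read it over the radio."""
    words = []
    for ch in callsign.upper():
        if ch in ICAO_ALPHABET:
            words.append(ICAO_ALPHABET[ch])
        elif ch in ICAO_DIGITS:
            words.append(ICAO_DIGITS[ch])
    return " ".join(words)

print(radio_spell("MHF"))   # Mike Hotel Foxtrot
print(radio_spell("QF95"))  # Quebec Foxtrot Niner Fife
```

The point of the convention, as the snippet makes visible, is that every character maps to exactly one unambiguous word, so a registration like “MHF” is always read the same way by every speaker on the frequency.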

In spite of globalisation and the spread of English, most people around the world are not native English speakers, and an increasing number of aviation professionals do not speak English as their first language.

Native speakers have an advantage when they learn Aviation English, since they already speak English at home and in their daily lives. But they encounter many pilots or ATC who learned English as a second or even third language.

Whose responsibility is it to ensure that communication is successful? Can native speakers simply speak the way they do at home and expect to be understood? Or do they also have the responsibility to make themselves understood and to learn how to understand pilots or ATC who are not native English speakers?

As a linguist, I analyse aviation language from a linguistics perspective. I have noted the restricted meaning of the few verbs and adjectives; that the only pronouns are “you” and sometimes “we” (“How do you read?”; “We’re overhead Camden”); how few questions there are, mostly imperatives (“Maintain heading 180”); and that the syntax is so simple (no complement clauses, no relative clauses, no recursion) that it might not even count as a human language for Chomsky.

But, as a pilot and a flight instructor, I look at it from the point of view of student pilots learning to use it in the cockpit while also learning to fly the airplane and navigate around the airfield.

How much harder is it to remember what to say when the workload goes up, and how much more difficult to speak over the radio when you know everyone else on the frequency is listening and will notice every little mistake you make?

Imagine, then, how much more difficult this is for pilots with English as a second language.

Camden Airport.
Supplied

Everyone learning another language knows it’s suddenly more challenging to hold a conversation over the phone than face-to-face, even with someone you already know. When it’s over the radio, with someone you don’t know, against the noise of the engine, static noise in the headphones, and while trying to make the plane do what you want it to do, it can be quite daunting.

No wonder student pilots who are not native English speakers sometimes prefer to stay silent, and even some experienced native English speakers will too, when the workload is too great.

This is one of the results of my research conducted in collaboration with UNSW’s Brett Molesworth, combining linguistics and aviation human factors.

Experiments in a flight simulator with pilots of diverse language backgrounds and flying experience explored conditions likely to result in pilots making mistakes or misunderstanding ATC instructions. Not surprisingly, increased workload, too much information and rapid ATC speech caused mistakes.

Also not surprisingly, less experienced pilots, no matter their English proficiency, made more mistakes. But surprisingly, it was the level of training, rather than number of flying hours or language background, that predicted better communication.

Once we understand the factors contributing to miscommunication in aviation, we can propose solutions to prevent them. For example, technologies such as Automatic Speech Recognition and Natural Language Understanding may help catch errors in pilot readbacks that ATC did not notice and might complement training for pilots and ATC.

It is vital that they understand each other, whatever their native language.


Dominique Estival, Researcher in Linguistics, Western Sydney University

This article was originally published on The Conversation. Read the original article.

Slang shouldn’t be banned … it should be celebrated, innit



Rob Drummond, Manchester Metropolitan University

Geezers and girls literally ain’t allowed to use slang words like “emosh” (emotional) anymore. The head teacher and staff of an academy in Essex, England appear to have taken great pleasure in banning the type of slang used in reality television series TOWIE, including many of the words in the above sentence, in a bid to improve the job prospects of their students.

Head teacher David Grant reportedly believes that by outlawing certain words and phrases and forcing students to use “proper English”, they will be in a better position to compete for jobs with non-native English speakers who may have a better command of the language. The way forward, he believes, is for young people to be using “the Queen’s English”, and not wasting time getting totes emosh about some bird or some bloke.

While nobody would doubt the good intentions behind such a scheme, it simply isn’t the way to go about achieving the desired aims. Of course, there’s always the possibility that this is all part of some clever plan to raise awareness and generate debate among the students about the language they use; in which case, great. Unfortunately, phrases such as “proper English”, “wrong usage” and “Queen’s English” suggest a very different and alarmingly narrow-minded approach to language.

Indeed, banning slang in schools is a short-sighted and inefficient way of trying to produce young people who are confident and adaptable communicators. What we should be doing is encouraging students to explore the fluidity, richness, and contextual appropriateness of an ever-changing language.

Slang: the real English.
Shutterstock

The fact is, there really is no such thing as “proper English”; there is simply English that is more or less appropriate in a given situation. Most of us would agree that “well jel” (very jealous) or “innit” have no place in most job interviews, but they do have a place elsewhere. Similarly, some people might get annoyed at what they see as the overuse of “like”, but it’s as much a part of young people’s language as “cool”, “yeah”, or “dude” might have been to their parents in their day.

This isn’t the first time a school has gone down this particular route in the quest to create more employable school leavers. In 2013, Harris Academy in south London produced a list of banned slang words and phrases including “bare” (a lot), “innit” and “we woz” in a bid to improve their pupils’ chances. Fast forward to 2015 and the policy was hailed a success, with the “special measures” school now being rated “outstanding”. But are we really to believe that this turnaround was purely due to eager staff policing children’s use of a few slang words? Isn’t it perhaps more likely that the new leadership team brought with them rather more than a naughty words list?

Language in flux

What is always missed in these discussions is that English is in a constant state of change, and this change simply can’t be stopped. You can hang on to your belief that “literally” can only mean “in a literal manner” as much as you like, but you can’t change the fact that it has another, equally legitimate, meaning. You can disapprovingly count the number of times your teenage son or daughter says “like” in a single conversation, but you can’t stop its rise in English in general.

Which is why a ban is so pointless. All it can possibly achieve is to make young people self-conscious about the way they speak, thus stifling creativity and expression. Do we really want the shy 13-year-old who has finally plucked up the courage to speak in class to be immediately silenced when the first word he or she utters is “Like…”? Or would we rather the teacher listens to what they have to say, then explores how the use of language can change the message, depending on the context? In other words, celebrate language diversity rather than restrict it.

And this is precisely what English language teachers do every day in their classes. Learning about language variation, about accents, dialects, and slang is all part of the curriculum, especially as students head towards A level. I can only imagine how frustrated these teachers must be when their senior staff then seek to publicly undo their good work by insisting on outdated, class-based, culturally biased notions of correct and incorrect usage.

In an English language class, students are taught how the ways in which we use language are part of how we construct and perform our social identities. Unfortunately, their break-times are then patrolled by some kind of language police who are tasked with ensuring those identities aren’t expressed (unless, presumably, they happen to be performing an acceptably middle-class job applicant identity at the time).

Different language is appropriate for different contexts. Yes, using TOWIE slang is inappropriate in a job interview, but no more inappropriate than using the Queen’s English in the playground. Unless you’re the Queen, obvs.

The Conversation

Rob Drummond, Senior Lecturer in Linguistics, Manchester Metropolitan University

This article was originally published on The Conversation. Read the original article.

English has taken over academia: but the real culprit is not linguistic


Anna Kristina Hultgren, The Open University and Elizabeth J. Erling, The Open University

Not only is April 23 the anniversary of William Shakespeare’s death, but the UN has chosen it as UN English Language Day in tribute to the Bard.

If growth in the number of speakers is a measure of success, then the English language certainly deserves to be celebrated. Since the end of World War I, it has risen to become the language with the highest number of non-native users in the world and is the most frequently used language among people who don’t share the same language in business, politics and academia.

In universities in countries where English is not the official language, English is increasingly used as a medium of instruction and is often the preferred language for academics in which to publish their research.

In Europe alone, the number of undergraduate and masters programmes fully taught in English grew from 2,389 in 2007 to 8,089 in 2014 – a 239% increase.

In academic publishing, the use of English has a longer history, especially in the sciences. In 1880, only 36% of publications were in English. This had risen to 50% by 1940-50, 75% by 1980 and 91% by 1996, with the figures for the social sciences and humanities slightly lower.

Today, the proportion of academic articles in the Nordic countries which are published in English is between 70% and 95%, and for doctoral dissertations it’s 80% to 90%.

Pros and cons of using English

One frequently cited advantage of publishing in English is that academics can reach a wider audience and also engage in work produced outside of their own language community. This facilitates international collaboration and, at least ideally, strengthens and validates research. In teaching, using English enables the mobility of staff and students and makes it possible for students to study abroad and get input from other cultures. It also helps develop language skills and intercultural awareness.

But some downsides have been identified. In the Nordic countries, for example, the national language councils have expressed concerns at the lack of use of national languages in academia. They’ve argued that this may impoverish these languages, making it impossible to communicate about scientific issues in Swedish, Danish, Finnish, Norwegian and Icelandic. There have also been fears that the quality of education taking place in English is lower because it may be harder to express oneself in a non-native language. And there are concerns about the creation of inequalities between those who speak English well and those who don’t – though this may begin to change.

Research suggests a more nuanced picture. National languages are still being used in academia and are no more threatened here than in other domains. Both teachers and students have been shown to adapt, drawing on strategies and resources that compensate for any perceived loss of learning. The ability to cope with education in a non-native language depends on a number of factors, such as level of English proficiency – which varies significantly across the world.

English built into the system

Some solutions to these problems have focused on devising language policies which are meant to safeguard local languages. For instance, many Nordic universities have adopted a “parallel language policy”, which accords equal status to English and to the national language (or languages, in the case of Finland, which has two official languages, Finnish and Swedish). While such initiatives may serve important symbolic functions, research suggests that they are unlikely to be effective in the long run.

Learning in Oslo – but in what language?
AstridWestvang/flickr.com, CC BY-NC-ND

This is because the underlying causes of these dramatic changes that are happening in academia worldwide are not simply linguistic, but political and economic. A push for competition in higher education has increased the use of research performance indicators and international bench-marking systems that measure universities against each other.

This competitive marketplace means academics are encouraged to publish their articles in high-ranking journals – in effect, this means English-language journals. Many ranking lists also measure universities on their degree of internationalisation, which tends to be interpreted rather simplistically as the ratio of international to domestic staff and students. Turning education into a commodity and charging higher tuition fees for overseas students also make it more appealing for universities to attract international students. All of this indirectly leads to a rise in the use of English: a shared language is necessary for such transnational activities to work.

The rise of English in academia is only a symptom of this competition. If the linguistic imbalance is to be redressed, then this must start with confronting the problem of a university system which has elevated competition and performance indicators to its key organising principle, in teaching as well as research.

The Conversation

Anna Kristina Hultgren, Lecturer in English Language and Applied Linguistics, The Open University and Elizabeth J. Erling, Lecturer of English Language Teaching, The Open University

This article was originally published on The Conversation. Read the original article.