What language tells us about changing attitudes to extremism


Words are more than their dictionary definition.
Amir Ridhwan/Shutterstock

Josie Ryan, Bangor University

The words “extreme”, “extremist” and “extremism” carry so many connotations these days – far more than a basic dictionary definition could ever cover. Most would agree that Islamic State, the London Bridge and Manchester Arena attackers, as well as certain “hate preachers”, are extremists. But what about Darren Osborne, who attacked the Finsbury Park Mosque? Or Thomas Mair, who murdered Labour MP Jo Cox? Or even certain media outlets and public figures who thrive on stirring up hatred between people? Their acts are hateful and ideologically-driven, but calls for them to be described in the same terms as Islamic extremists are more open to debate.

The word “extreme” comes from the Latin (as so many words do) “extremus”, meaning, literally, far from the centre. But the words “extremist” and “extremism” are relatively new to the English language.

Much language is metaphorical, especially when we talk about abstract things, such as ideas. So, when we use “extreme” metaphorically, we mean ideas and behaviour that are not moderate and do not conform to the mainstream. These are meanings we can find in a dictionary, but this is not necessarily how or when extreme, extremist, and extremism are used in everyday life.

Lingua

One way of finding out how words are used is to look at massive databases of language, called corpora. To find out more about how these words developed in Britain, I turned to the Hansard corpus, a collection of parliamentary speeches from 1803 to 2005. Political language is quite specific, but analysing it is a good way to see how the issues of the day are being described. In addition, having a record which covers two centuries shows us how words and their meanings have changed over time.
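
To make the method concrete, here is a minimal Python sketch of the kind of query such a corpus search involves: counting how often the search terms appear per decade. The simple (year, text) input format and the toy speeches are placeholders rather than the actual Hansard interface, and a real corpus study would normalise these raw counts per million words.

```python
from collections import Counter
import re

# Minimal sketch: count how often each search term appears per decade.
# The (year, text) input format is a placeholder, not the Hansard API,
# and real corpus work would normalise counts per million words.
TERMS = {"extreme", "extremely", "extremist", "extremists", "extremism"}

def decade_frequencies(speeches):
    """speeches: an iterable of (year, text) pairs."""
    counts = {}  # decade -> Counter of search terms
    for year, text in speeches:
        decade = (year // 10) * 10
        bucket = counts.setdefault(decade, Counter())
        for word in re.findall(r"[a-z]+", text.lower()):
            if word in TERMS:
                bucket[word] += 1
    return counts

# Toy usage with invented speeches:
sample = [
    (1923, "The extremist element must not be allowed to prevail."),
    (1974, "Extremism of every kind threatens the peace of this House."),
]
for decade, freq in sorted(decade_frequencies(sample).items()):
    print(decade, dict(freq))
```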

Apart from the adverb “extremely” – used in the same way as “really” and “very” – my search showed that the word extreme was used most frequently in its adjective form during this 200-year period. However, usage of extreme as an adjective has been declining since the mid-1800s, as has the noun form. At the same time, two new nouns, “extremist” and “extremism”, begin to appear in the corpus in the late 1800s, and their usage gradually increases as time goes on. No longer are certain views and opinions described as extreme; instead, extremist and extremism are used as shorthand for complex ideas, characteristics, processes and even people.

https://datawrapper.dwcdn.net/hI734/1/

In the graph above, we can see three peaks in the frequency of the noun extremist(s). It is interesting to see which groups have been labelled as extremist in the past as this can provide clues about who is considered an extremist these days, and also who is not.

In the 1920s, extremist and extremism were often used in connection with the Irish and Indian fights for independence from the British Empire. Fifty years on, they were linked with another particularly violent period in Irish history, while Rhodesia was also fighting for independence from Britain in the 1970s. The final increase in usage of the terms extremist and extremism comes, perhaps unsurprisingly, at the start of the 21st century.

However, the words have not been solely linked to violence: they were very often used to describe miners in the 1920s and animal rights activists in the 2000s. Both of these groups have had a lot of support from the British population if not from politicians speaking in parliament.

I also looked at the words that appear around the extreme words, or “collocates”. I found that the collocates of the search terms became increasingly negative over the period covered by the Hansard corpus. They also became less connected to situations, and more closely connected to political or religious ideas and violence. For example, in the late 20th century and early 2000s, “extremism” became more associated with Islam and, at the same time, was collocated with words such as “threat”, “hatred”, “attack”, “terror”, “evil”, “destroy”, “fight”, and “xenophobic”.
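
As an illustration of what looking at collocates involves, the sketch below counts the words that occur within a few tokens either side of a node word. It shows only the windowing idea: corpus tools normally rank collocates using association measures such as mutual information or log-likelihood rather than raw counts, and the example sentence here is invented.

```python
from collections import Counter
import re

# Illustrative sketch of window-based collocation: count the words that
# occur within +/- 4 tokens of a node word. Corpus tools usually rank
# collocates with association measures (mutual information, log-likelihood)
# rather than these raw counts; the example sentence is invented.
def collocates(text, node="extremism", window=4):
    tokens = re.findall(r"[a-z]+", text.lower())
    counts = Counter()
    for i, token in enumerate(tokens):
        if token != node:
            continue
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        counts.update(tokens[j] for j in range(lo, hi) if j != i)
    return counts

print(collocates(
    "the threat of violent extremism and the fight against extremism "
    "require urgent action"
).most_common(5))
```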

Extremism

After 2005, the extremist terms became much more frequently associated with the Islamic faith – to the point where the word “extremist” is now almost exclusively used to refer to a Muslim who has committed a terrorist act, and some have suggested there is reluctance to use it otherwise.

Looking at the collocates of extremist and extremism in a corpus of UK web news, which runs from 2010 to 2017, five of the top 10 collocates are related to Islam. “Right wing” and “far-right” also appear in the top 10. However, the top three collocates – “Islamic”, “Islamist” and “Muslim” – appear 50% more frequently than the other seven collocates in the list combined.

The most interesting thing to come out of this investigation is what has gone unsaid. Extremist and extremism are not being used as they were in the past to describe violent, hateful, and ideologically-driven acts, with no reference to ethnicity or faith. Today, the terms have become almost solely reserved for use in reference to Muslims who perpetrate terrorist attacks.

The words we use can affect and reveal how we perceive the world around us. Word meanings change over time, but reluctance to use the same word for the same behaviour betrays a bias towards crimes that are, perhaps, uncomfortably mainstream.

Josie Ryan, PhD Researcher, Bangor University

This article was originally published on The Conversation. Read the original article.

 


British literature is richly tangled with other histories and cultures – so why is it sold as largely white and English?


Brick Lane: popularised in a novel by British writer Monica Ali.
Shutterstock

Elleke Boehmer, University of Oxford and Erica Lombard, University of Oxford

Recent global developments have sharply polarised communities in many countries around the world. A new politics of exclusion has drawn urgent attention to the ways in which structural inequality has marginalised and silenced certain sectors of society. And yet, as a recent report shows, diversity and inclusion in fact “benefit the common good”. A more diverse group is a stronger, more creative and productive group.

In the world of literary writing, we find similar gaps and exclusions. But these are counterbalanced in some respects by new positive initiatives.

In 2015, a study revealed that literature by writers of colour had been consistently under-represented by the predominantly white British book industry. Statistics in The Bookseller show that out of thousands of books published in 2016 in the UK, fewer than 100 were by British authors of a non-white background. And out of 400 authors identified by the British public in a 2017 Royal Society of Literature survey, only 7% were black, Asian or of mixed race (compared to 13% of the population).

Colourful misrepresentation

A similar marginalisation takes place in the curricula in schools and universities, mirroring exclusions in wider society. In most English literature courses of whatever period, the writers taught are white, largely English and largely male.

A fundamental inequality arises in which, though British culture at large is diverse, syllabuses are not. Indeed, many British readers and students find little to recognise or to identify with when they read and study mainstream British literature.

But it’s not just a case of under-representation. It’s also a case of misrepresentation.

Black and Asian writers who have been published within the mainstream British system describe the pressure they have felt to conform to cultural stereotypes in their work. Their books are often packaged and presented in ways that focus on their ethnicity, regularly using cliches. At the same time, more universal aspects of their writing are overlooked. For example, the covers of novels by Asian British writers usually stick to a limited colour palette of yellows, reds, and purples, accented by “exotic” images.

 

These writers bristle at the sense that they are read not as crafters of words and worlds, but as spokespeople for their communities or cultures. At its worst, this process turns these writers and their books into objects of anthropological curiosity rather than works inviting serious literary study or simply pleasurable reading. The message is that black and Asian literature is other than or outside mainstream British writing.

Against these exclusions, leading British authors such as Bernardine Evaristo have called for a broader, more inclusive approach. They recognise that what and how we read shapes our sense of ourselves, our communities and the world.

Reframing the narrative

The Postcolonial Writers Make Worlds research project, based in the Oxford English Faculty and The Oxford Research Centre in the Humanities, set out to ask what it means to read contemporary fiction as British readers. Working with reading groups and in discussion with writers, we found that readers of all ages entered the relatively unfamiliar worlds created by BAME authors with interest.

For many, finding points of familiarity along gender, age, geographical or other lines was important for their ability to enjoy stories from communities different from their own. Identifying in this way gave some readers new perspectives on their own contexts. At the same time, unfamiliarity was not a barrier to identification. In some cases, universal human stories, like falling in love, acted as a bridge. This suggests that how literature is presented to readers, whether it is framed as other or not, can be as significant as what is represented.

Contemporary black and Asian writing from the UK is British writing. And this means that the work of writers such as Evaristo, Nadifa Mohamed and Daljit Nagra should be placed on the same library shelf, reading list and section of the bookshop as work by Ian McEwan, Julian Barnes and Ali Smith – not exclusively in “world interest” or “global literature” sections.

Bookish.
Shutterstock

Equally, much can be gained by thinking of white British writers like Alan Hollinghurst or Hilary Mantel as having as much of a cross-cultural or even postcolonial outlook as Aminatta Forna and Kamila Shamsie.

There are positive signs. A new Edexcel/Pearson A-level teaching resource on Contemporary Black British Literature has been developed. The Why is My Curriculum White? campaign continues to make inroads in university syllabuses. And the Jhalak Prize is raising the profile of BAME writing in Britain. Against this background, the Postcolonial Writers Make Worlds website offers a multimedia hub of resources on black and Asian British writing, providing points of departure for more inclusive, wide-ranging courses. Yet there is still much to be done.

All literature written in English in the British Isles is densely entangled with other histories, cultures, and pathways of experience both within the country and far beyond. Its syllabuses, publishing practices, and our conversations about books must reflect this.

Elleke Boehmer, Professor of World Literature in English, University of Oxford and Erica Lombard, Postdoctoral Research Fellow, University of Oxford

This article was originally published on The Conversation. Read the original article.

 

Why your ability to speak English could be judged on how you look


The idea of a ‘native speaker’ creates bias.
Gustavo Frazao/www.shutterstock.com

Mario Saraceni, University of Portsmouth

When MPs published a set of new proposals in early January on ways to improve the integration of immigrants in the UK, one of the most controversial concerned speaking English. The report from the All Party Parliamentary Group on Social Integration, chaired by the Labour MP Chuka Umunna, proposed that:

All immigrants should be expected to have either learned English before coming to the UK or be enrolled in compulsory ESOL [English for Speakers of Other Languages] classes upon arrival.

The assumption here is that migrants don’t speak English, or not well enough to integrate within British society. However, this disregards the fact that English is a global language, spoken by up to two billion multilingual people around the world. It is a case of what the linguist Adrian Holliday termed “native-speakerism” – the belief in the linguistic superiority of “native speakers” and the consequent discrimination of those who are considered “non-native speakers”.

But the distinction between native speakers and non-native speakers is often not just based on actual language skills, but, far more disturbingly, on assumptions based on ethnicity.

Language and look

The sensitivities around this issue were clearly on display during a Channel 4 news report on the MPs’ proposals on January 5.

The report included brief interviews with two women currently living in the UK, one from Somalia and the other from The Gambia. They were meant to represent examples of the particular category of migrants requiring compulsory English language classes under the new proposals. But both women spoke good English when answering the interviewer’s questions about the current lack of English class provision.

The two women were from Sub-Saharan Africa, were of an ethnicity that was visibly different from that of a typical white British person and, in the case of the Gambian woman, wore clothes that marked non-Judaeo-Christian cultural and religious affiliations. They not only represented migrants needing English language tuition but also prototypical migrants, exhibiting suitably “exotic” traits that identified them as such.

There are two widespread underlying assumptions here that need to be challenged. The first relates to the ways in which national identity is often understood in terms of ethnicity. As the cultural scholar Paul Gilroy remarked in his book There Ain’t No Black in the Union Jack, in Britain “conceptions of national belonging and homogeneity … not only blur the distinction between ‘race’ and nation, but rely on that very ambiguity for their effect”.

The second assumption is that there is an exclusive bond between a language and the nation it naturally belongs to – a point I have discussed previously on The Conversation.

Speaking with legitimacy

In turn, these two assumptions combined produce a third one: only those who are the rightful members of a nation, by birth and by race, are legitimate speakers of the language of that nation.

Will this do?
Elena Rostunova/www.shutterstock.com

Citing a number of research studies, Holliday made the point that native-speakerism is not a matter of language alone, but is closely connected to ethnicity and race, even though this connection is rarely made explicit.

It produces situations of great inequality around the world. In the field of English language teaching, for example, it is not uncommon for jobs to be available exclusively to native speakers (sometimes explicitly defined as “white” or “Caucasian”), or for non-native speakers to receive significantly lower remuneration even when they possess higher qualifications.

This kind of inequity affects migrants in general. A 2011 study found that African migrants’ “accents and varieties of English had been treated as inferior by ‘native speakers’ in traditional English speaking countries.” According to the study, migrants were also:

Made to feel as if they were unproficient in English, weak in communication skills, or unintelligible. They got the impression that only speaking in the prestige/native varieties of English counted for proficiency and educational or professional success.

Native-speakerism is also intertwined with a colonial view of the world where the coloniser is attributed with cultural superiority over the colonised. This mentality has persisted well after the end of colonialism and is so pervasive that it affects the ways non-native speakers see themselves too – as inadequate and defective users of English. Tellingly, the Gambian woman in the Channel 4 segment said, “I don’t have confidence for myself to speak English” – summing up this mindset exactly.

It is important that we become more conscious of the fact that English is not just an “English” language and that the ability to speak it has nothing to do with how “English” a person looks or behaves.

Mario Saraceni, Senior Lecturer in English Language and Linguistics, University of Portsmouth

This article was originally published on The Conversation. Read the original article.

 

What will the English language be like in 100 years?


Ever evolving.
Feng Yu/www.shutterstock.com

Simon Horobin, University of Oxford

One way of predicting the future is to look back at the past. The global role English plays today as a lingua franca – used as a means of communication by speakers of different languages – has parallels in the Latin of pre-modern Europe.

Having been spread by the success of the Roman Empire, Classical Latin was kept alive as a standard written medium throughout Europe long after the fall of Rome. But the Vulgar Latin used in speech continued to change, forming new dialects, which in time gave rise to the modern Romance languages: French, Spanish, Portuguese, Romanian and Italian.

Similar developments may be traced today in the use of English around the globe, especially in countries where it functions as a second language. New “interlanguages” are emerging, in which features of English are mingled with those of other native tongues and their pronunciations.

Despite the Singaporean government’s attempts to promote the use of Standard British English through the Speak Good English Movement, the mixed language known as “Singlish” remains the variety spoken on the street and in the home.

Spanglish, a mixture of English and Spanish, is the native tongue of millions of speakers in the United States, suggesting that this variety is emerging as a language in its own right.

Meanwhile, the development of automatic translation software, such as Google Translate, means that such technology will come to replace English as the preferred means of communication employed in the boardrooms of international corporations and government agencies.

So the future for English is one of multiple Englishes.

Looking back to the early 20th century, it was the Standard English used in England, spoken with the accent known as “Received Pronunciation”, that carried prestige.

But today the largest concentration of native speakers is in the US, and the influence of US English can be heard throughout the world: can I get a cookie, I’m good, did you eat, the movies, “skedule” rather than “shedule”. In the future, to speak English will be to speak US English.

US spellings such as disk and program are already preferred to British equivalents disc and programme in computing. The dominance of US usage in the digital world will lead to the wider acceptance of further American preferences, such as favorite, donut, dialog, center.

What is being lost?

In the 20th century, it was feared that English dialects were dying out with their speakers. Projects such as the Survey of English Dialects (1950-61) were launched at the time to collect and preserve endangered words before they were lost forever. A similar study undertaken by the BBC’s Voices Project in 2004 turned up a rich range of local accents and regional terms which are available online, demonstrating the vibrancy and longevity of dialect vocabulary.

But while numerous dialect words were collected for “young person in cheap trendy clothes and jewellery” – pikey, charva, ned, scally – the word chav was found throughout England, demonstrating how features of the Estuary English spoken in the Greater London area are displacing local dialects, especially among younger generations.

The turn of the 20th century was a period of regulation and fixity – the rules of Standard English were established and codified in grammar books and in the New (Oxford) English Dictionary on Historical Principles, published as a series of volumes from 1884-1928. Today we are witnessing a process of de-standardisation, and the emergence of competing norms of usage.

In the online world, attitudes to consistency and correctness are considerably more relaxed: variant spellings are accepted and punctuation marks omitted, or repurposed to convey a range of attitudes. Research has shown that in electronic discourse exclamation marks can carry a range of exclamatory functions, including apologising, challenging, thanking, agreeing, and showing solidarity.

Capital letters are used to show anger, misspellings convey humour and establish group identity, and smiley-faces or emoticons express a range of reactions.

Getting shorter

Some have questioned whether the increasing development and adoption of emoji pictograms, which allow speakers to communicate without the need for language, mean that we will cease to communicate in English at all. 😉

The fast-changing world of social media is also responsible for the coining and spreading of neologisms, or “new words”. Recent updates to Oxford Dictionaries give a flavour: mansplaining, awesomesauce, rly, bants, TL;DR (too long; didn’t read).

How Oxford Dictionaries choose which new words to include.

Clipped forms, acronyms, blends and abbreviations have long been productive methods of word formation in English (think of bus, smog and scuba) but the huge increase in such coinages means that they will be far more prominent in the English of 2115.

Whether you 👍 or h8 such words, think they are NBD or meh, they are undoubtedly here to stay.

Simon Horobin, Professor of English Language and Literature, University of Oxford

This article was originally published on The Conversation. Read the original article.

 

To write English like a professor, don’t rely on Google Translate


You haven’t used ‘stakeholder’ enough.
Professor and student via Tyler Olson/Shutterstock

Mike Groves, University of Nottingham

Thankfully, nobody speaks academic English as a first language. The English of the university is a very particular form that has specific features and conventions. Sometimes, this is just referred to as “academic style”.

It used to be a matter of instinct – what felt right. But now a large amount of research is using a “big data” approach to analyse millions of words of academic writing. This has resulted in projects such as the Academic Word List, the 570 most commonly used words in academic text across disciplines (excluding the 2,000 most common words in English).

Other research has looked at which words and phrases are often used to thread the various parts of a piece of academic writing together. These are known as “lexical bundles”, and they reflect the view that a language doesn’t have a separate grammar and vocabulary – rather, the two are combined into a “lexico-grammar”.
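
As a rough sketch of how such bundles can be surfaced, the snippet below counts every four-word sequence in a sample of prose and keeps the ones that recur. Published lexical-bundle studies add thresholds such as a minimum frequency per million words and a minimum spread across texts; the sample string here is invented for illustration.

```python
from collections import Counter
import re

# Rough sketch: surface candidate "lexical bundles" by counting every
# four-word sequence (4-gram) and keeping the ones that recur. Published
# studies add frequency-per-million and dispersion thresholds; the sample
# text below is invented for illustration.
def four_grams(text):
    tokens = re.findall(r"[a-z]+", text.lower())
    return Counter(
        " ".join(tokens[i:i + 4]) for i in range(len(tokens) - 3)
    )

sample = (
    "as a result of the change it can be seen that the process is driven "
    "as a result of the underlying mechanism and it can be seen that the "
    "effect persists"
)
print([bundle for bundle, n in four_grams(sample).items() if n > 1])
```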

Every university which teaches in English will have an English language teaching department to support students whose first language is not English. These departments vary in size and prominence, but what they all share is that they purport to teach “English for academic purposes”.

What’s acceptable?

There is a tension within the field about what type of English should be acceptable. It is recognised that English is no longer solely the language of the British, North Americans and Australasians. English is spoken as a first or second language in countries such as Kenya and Malaysia, and is being learnt as a lingua franca by vast numbers of people across the world, from Argentina to China.

This has led to a re-conceptualisation of English as a language without a centre, and with a much greater flexibility in terms of grammatical accuracy than a single standard. This makes enormous sense when a Pole is talking to a Vietnamese person in a social setting: the niceties of grammar do not matter, the message does.

However, academic writing is generally done under high-stakes conditions, with students’ degrees judged on the result. This leads to a tension between encouraging students to write with clarity, with accuracy or with fluency.

If we tell our students “the message, not the grammar, matters” and the person marking their dissertations insists that the correct form is “the data suggest” not “the data suggests”, we are doing them a disservice.

This idea of a grammatical normality becomes even more complex when we bring in students who are using English as a second language, but often in highly localised forms.

For example, a Malaysian student whose English is perfectly functional, but includes very few auxiliary verbs such as “does” and “am”, may come to a course on English for academic purposes, only to be told that the form of English he speaks, and has spoken since childhood, is “wrong”.

Some see this as the equivalent of an American studying in Britain being told she must stop using words such as “candy” and “mailman” – surely an intolerable intrusion into her linguistic identity. Or, should we teach students that they must, at least in the academic context, speak and write within the grammatical norms of the UK? This seems like an unacceptably colonial attitude.

The solution most practitioners and writers come up with is to allow the students the choice. To do this, they need to allow them to see and analyse the norms of academic writing. By doing this, the control of what is acceptable moves from the teacher to the students, allowing them a much larger amount of autonomy in their writing.

This also encourages students to think of themselves as members of the academic community, with their own academic self and academic voice. This is reflected in the shift in title from “teacher” or “tutor” to “language adviser”, seen in many English for academic purposes departments.

The coming of Google Translate

Another issue facing the sector is that of free online automatic translation, such as Google Translate. It could be argued that such tools could put English for academic purposes departments out of business: why would a student go to the expense and effort of learning English when they can simply translate in and out of their first language online?

But this argument rests on the assumption that academic English relies on a surface level of literacy, based purely on grammatical accuracy. What Google Translate is unable to do is teach the deeper academic literacies needed by students for full engagement with the academic community. It is my prediction that machine translation will inevitably change the way students are taught English – but it will not replace that teaching.

Alongside these debates, it’s important to remember that this type of English teaching is all about language being used to express complex ideas, and it is impossible to teach without a certain complexity and depth of content. It is also about aligning to the intellectual norms of academia, which generally involves a large amount of analysis and picking apart of concepts within a rigorous intellectual framework.

We show respect to the intellectuals by trying to deconstruct their work. However, many students will come to an English for academic writing course from a radically different intellectual background, one where school, or even university, has been more focused on understanding and reworking the ideas of the established experts, and on respecting their work by not trying to deconstruct it.

Therefore, the job of the English for academic purposes teacher gains an extra layer. They need to encourage a different type of thinking in the students, in order to enable them to take part in a different type of writing.

Mike Groves, Head of the Centre for English Language Education, Malaysia Campus, University of Nottingham

This article was originally published on The Conversation. Read the original article.

 

Brexit could create a new ‘language’ – Euro-English


The EU may develop its own unique form of English and could swing the global balance on American versus English spellings of words like ‘harbor’ and ‘organization’

Ian Johnston, Science Correspondent


Brexit could lead to the development of a new form of the English language, according to a new academic paper.

Dr Marko Modiano, of Gavle University in Sweden, said there were already signs that “Euro-English” was developing its own distinct way of speaking.

And this could eventually be codified in a dictionary and taught in schools in much the same way that American or Australian English is today if English is retained as the lingua franca of the European Union after the UK leaves.

The Europeans might also decide to adopt American spellings, Dr Modiano said, which would add about 443 million to the total population using that system.

Euro-English has already developed its own new definitions for some words based on the “Eurospeak” deployed in Brussels.

For example, “eventual” is now used as “a synonym for possible or possibly”, Dr Modiano wrote in the journal World Englishes.

“Subsidiarity” has come to mean “the principle that legal decrees should be enacted as close to people as possible”; “Berlaymont” means “bureaucracy”, “conditionality” means “conditions”; and “semester” is “used to mean six months”.

“The use of eventual as a synonym for possible or possibly is actually showing signs of being accepted and may, in the near future, be considered a feature of Euro-English,” Dr Modiano said.

Grammar is also changing.

For example, “I am coming from Spain” can be used to mean “I come from Spain” as part of an expansion of the use of -ing forms of verbs.

And “we were five people at the party” means the same as “there were five people at the party”.

“In my observations, continental Europeans speaking English as a [second language] readily use this construction, we were, instead of there were, and seem comfortable with its use and meaning,” Dr Modiano said.

He argued that English was likely to remain the EU’s lingua franca despite suggestions that it should be ditched once no member state has it as their official language. Ireland chose Irish and Malta chose Maltese, despite the widespread use of English in both countries.

The EU would have only about five million native speakers of English after Brexit, representing about 1 per cent of its population.

While politicians on the left and right in France have seen Brexit as a chance to reinforce the global status of French, other countries would be reluctant to switch to French as their main second language, given the widespread use of English on the world stage and their investment in teaching it.

But without native English speakers from the UK to police its linguistic rules, Euro-English could develop a life of its own.

“When taking on the role of language guardian, and to some respect being given that role by continental European language specialists, the British met little resistance because of their considerable numbers, their heritage as the founders of the language, as well as the very fact that they are [first language] users of English,” Dr Modiano wrote.

“In this capacity, they have been successful in establishing the understanding that their version of the language, standardised British English with RP pronunciation, is the more esteemed form of the English language across the globe.

“With the British gone, no one will be there to carry on the work of defending the structural integrity of British English in the face of competition from not only American English, but also from [second language] users who increasingly utilise features indicative of discoursal nativisation which are in the processes of becoming systematic across continental Europe.”

He said Europeans “may well debate the pros and cons” of American and English spellings “without being influenced by ‘native speakers’ of either variety”.

“It is conceivable that the American-English spelling system may be deemed more utilitarian. That some 70 per cent of ‘native speakers’ use this spelling convention, which dominates the Internet, further strengthens the argument to implement it for Europe as well,” Dr Modiano said.

Euro-English could help provide its users with a “sense of identity” among other benefits which were “both logical and welcome”.

“In the act of recognising the validity of Euro-English,” Dr Modiano wrote, “one liberates continental European [second language] users of English from the tyranny of standard language ideology.”

(Original article: http://www.independent.co.uk/news/science/brexit-latest-news-language-euro-english-uk-leave-eu-european-union-a7957001.html)

 

EMI ‘doesn’t work’ for adults with weak second language skills


September 2017 (The EL Gazette)


Teach an adult a foreign language and content simultaneously, and they’ll learn very little in either, research suggests.

Teaching academic content in a foreign language without explicit language support can be detrimental to the learning of both the subject and the foreign tongue, researchers have said.

Practices such as English as a medium of instruction (EMI) could be ineffective compared to learning language and content separately unless students have already mastered their foreign language or receive targeted support, a new study warns.

‘It’s safe to say that EMI doesn’t work, from the data we got,’ Dr John Sweller, one of the authors of the study, told the Gazette. ‘If you try to teach adults a second language simultaneously with content, they’ll end up with very little learning in either.’

The authors explain that learning a foreign language as an adult means acquiring ‘biologically secondary knowledge’, which requires explicit instruction and ‘conscious effort’. Cognitive load theory, the study says, suggests that learning two things at the same time won’t work – and the data seems to confirm this.

Over three experiments, each with control conditions, researchers found that reading a text only in the foreign language led to the worst outcomes for both content and language learning. In all three experiments, which involved a total of 294 French university students, participants were divided into three groups. A text about a topic relevant to the students’ course was given to the first group in the foreign language, to the second group in French, and to the third in both the foreign language and French. The foreign language was German for the three groups in the first experiment and English for the three groups of the other two experiments. The students, who averaged a CEFR B1 in the foreign language, were then asked to study the texts and to perform three tests afterwards: two language and translation tests and one content test with questions in French.

Unsurprisingly, the French-only conditions yielded the best results for the content test. However, results were higher in the language and translation tests for students who read the text in both languages. The full ‘immersion’ conditions obtained the worst results in all three experiments.

Language proficiency is paramount for EMI to lead to successful learning, the authors said. Dr Sweller explained, ‘Ideally, you should not learn content in the L2 until your language levels are so high that you don’t have to consciously think about the language. Until then, according to everything we know about human cognition, you are going to struggle – both in the language and in the content.’

The idea is that we have a limited pool of resources for learning, and the brain struggles to learn two things at the same time for this reason. Co-author Dr Andre Tricot added, ‘All the resources you use to deal with the language are unavailable to learn the content.’

Universities should put in place extra support to ensure students don’t struggle with lessons delivered in English, the authors say. Co-author Dr Danielle Joulia said students would benefit from targeted language support that covers the vocabulary of the topic they are studying, ideally before each class. ‘General academic English support is not enough,’ she insisted.


Just like Clil at primary and secondary level, EMI is gaining ground in higher education around the world (see our EMI special). Although Clil has more focus on pedagogical support in the foreign language, both methodologies aim to teach content and language simultaneously. This approach doesn’t seem to tally with the findings of this study, or with cognitive load theory in general. ‘The idea that we have a limited pool of resources [see main piece] is common knowledge,’ said Dr Sweller. ‘It’s incomprehensible that there is such a divide between research and practice.’ But the divide is not only between research and practice, but also between research and policy, the authors said. ‘There is a lot of politics around internationalisation. Much of these procedures [such as EMI] have been brought in without any prior research whatsoever,’ Dr Sweller said. And research is still lacking, with very few randomised control trials, he added.

For Dr Stephanie Roussel, the lead author of this study, there should also be a cultural perspective on the issue. ‘EMI and Clil seem to work well in countries such as Canada and Belgium, which are bilingual,’ she explained. ‘Maybe implementing such an approach in a different cultural context, such as broadly monolingual France, won’t yield the same results without strong second language support.’