How learning a new language improves tolerance


Why learn a new language?
Timothy Vollmer, CC BY

Amy Thompson, University of South Florida

There are many benefits to knowing more than one language. For example, aging adults who speak more than one language have been shown to be less likely to develop dementia.

Additionally, the bilingual brain becomes better at filtering out distractions, and learning multiple languages improves creativity. Evidence also shows that learning subsequent languages is easier than learning the first foreign language.

Unfortunately, not all American universities consider learning foreign languages a worthwhile investment.

Why is foreign language study important at the university level?

As an applied linguist, I study how learning multiple languages can have cognitive and emotional benefits. One of these benefits that’s not obvious is that language learning improves tolerance.

This happens in two important ways.

The first is that it opens people’s eyes to ways of doing things that are different from their own, which is called “cultural competence.”

The second is related to the comfort level of a person when dealing with unfamiliar situations, or “tolerance of ambiguity.”

Gaining cross-cultural understanding

Cultural competence is key to thriving in our increasingly globalized world. How specifically does language learning improve cultural competence? The answer can be illuminated by examining different types of intelligence.

Psychologist Robert Sternberg’s research on intelligence describes different types of intelligence and how they are related to adult language learning. What he refers to as “practical intelligence” is similar to social intelligence in that it helps individuals learn nonexplicit information from their environments, including meaningful gestures or other social cues.

Learning a foreign language reduces social anxiety.
COD Newsroom, CC BY

Language learning inevitably involves learning about different cultures. Students pick up clues about the culture both in language classes and through meaningful immersion experiences.

Researchers Hanh Thi Nguyen and Guy Kellogg have shown that when students learn another language, they develop new ways of understanding culture through analyzing cultural stereotypes. They explain that “learning a second language involves the acquisition not only of linguistic forms but also ways of thinking and behaving.”

With the help of an instructor, students can critically think about stereotypes of different cultures related to food, appearance and conversation styles.

Dealing with the unknown

The second way that adult language learning increases tolerance is related to the comfort level of a person when dealing with unfamiliar situations – that is, “tolerance of ambiguity.”

Someone with a high tolerance of ambiguity finds unfamiliar situations exciting, rather than frightening. My research on motivation, anxiety and beliefs indicates that language learning improves people’s tolerance of ambiguity, especially when more than one foreign language is involved.

It’s not difficult to see why this may be so. Conversations in a foreign language will inevitably involve unknown words. It wouldn’t be a successful conversation if one of the speakers constantly stopped to say, “Hang on – I don’t know that word. Let me look it up in the dictionary.” Those with a high tolerance of ambiguity would feel comfortable maintaining the conversation despite the unfamiliar words involved.

Applied linguists Jean-Marc Dewaele and Li Wei also study tolerance of ambiguity and have indicated that those with experience learning more than one foreign language in an instructed setting have more tolerance of ambiguity.

What changes with this understanding

A high tolerance of ambiguity brings many advantages. It helps students become less anxious in social interactions and in subsequent language learning experiences. Not surprisingly, the more experience a person has with language learning, the more comfortable the person gets with this ambiguity.

And that’s not all.

Individuals with higher levels of tolerance of ambiguity have also been found to be more entrepreneurial (i.e., are more optimistic, innovative and don’t mind taking risks).

In the current climate, universities are frequently being judged by the salaries of their graduates. Taking it one step further, based on the relationship of tolerance of ambiguity and entrepreneurial intention, increased tolerance of ambiguity could lead to higher salaries for graduates, which in turn, I believe, could help increase funding for those universities that require foreign language study.

Those who have devoted their lives to theorizing about and teaching languages would say, “It’s not about the money.” But perhaps it is.

Language learning in higher ed

Most American universities have a minimal language requirement that often varies depending on the student’s major. However, students can typically opt out of the requirement by taking a placement test or providing some other proof of competency.

Why more universities should teach a foreign language.
sarspri, CC BY-NC

In contrast to this trend, Princeton recently announced that all students, regardless of their competency when entering the university, would be required to study an additional language.

I’d argue that more universities should follow Princeton’s lead, as language study at the university level could lead to an increased tolerance of the different cultural norms represented in American society, which is desperately needed in the current political climate with the wave of hate crimes sweeping university campuses nationwide.

Knowledge of different languages is crucial to becoming global citizens. As former Secretary of Education Arne Duncan noted,

“Our country needs to create a future in which all Americans understand that by speaking more than one language, they are enabling our country to compete successfully and work collaboratively with partners across the globe.”

Considering the evidence that studying languages as adults increases tolerance in two important ways, the question shouldn’t be “Why should universities require foreign language study?” but rather “Why in the world wouldn’t they?”

Amy Thompson, Associate Professor of Applied Linguistics, University of South Florida

This article was originally published on The Conversation. Read the original article.

 


Language puts ordinary people at a disadvantage in the criminal justice system


‘Now, did you understand all that?’
Shutterstock

David Wright, Nottingham Trent University

Language is pervasive throughout the criminal justice system. A textual chain follows a person from the moment they are arrested until their day in court, and it is all underpinned by meticulously drafted legislation. At every step, there are challenges faced by laypeople who find themselves in the linguistic webs of the justice system.

Anyone who reads a UK act of parliament, for example, is met with myriad linguistic complexities. Archaic formulae, complex prepositions, lengthy and embedded clauses abound in the pages of the law. Such language can render legal texts inaccessible to the everyday reader. Some argue (see Vijay Bhatia’s chapter) that this is a deliberate ploy by the legal establishment to keep the non-expert at an arm’s length.

But closer to the truth is the fact that legal language, like all language in all contexts, is the way it is because of its function and purpose. Those drafting laws must ensure enough precision and unambiguity so that the law can be applied, while also being flexible and inclusive enough to account for the unpredictability of human behaviour.

The cost of this linguistic balancing act, however, is increased complexity and the exclusion of the uninitiated. Legal language has long been in the crosshairs of The Plain English Campaign which argues for its simplification, claiming that “if we can’t understand our rights, we have no rights”.

It is not only written legal language that presents difficulties for the layperson. Once someone is arrested they go through a chain of communicative events, each one coloured by institutional language, and each one with implications for the next. It begins with the arresting officer reading the suspect their rights. In England and Wales, the police caution reads:

You do not have to say anything. But, it may harm your defence if you do not mention when questioned something which you later rely on in court. Anything you do say may be given in evidence.

This may seem very familiar to many readers (perhaps due to their penchant for police dramas), but this short set of statements is linguistically complex. The strength of the verb “may”, what exactly constitutes “mentioning” or “relying”, and what “questioning” is and when it will take place are just some of the ambiguities that may be overlooked at first glance.

What the research says

Indeed, research has found that, although people claim to fully comprehend the caution, they are often incapable of demonstrating any understanding of it at all. Frances Rock has also written extensively on the language of cautioning and found that when police officers explain the caution to detainees in custody, there is substantial variation in the explanations offered. Some explanations add clarity, while others introduce even more puzzles.

This issue of comprehensibility is compounded, of course, when the detainee is not a native speaker of English.

The word of the law.
Shutterstock

The difficulties in understanding legal language are typically overcome by the hiring of legal representation. Peter Tiersma, in his seminal 1999 book Legal Language, noted that “the hope that every man can be his own lawyer, which has existed for centuries, is probably no more realistic than having people be their own doctor”.

However, in the UK at least, cuts in legal aid mean that more people are representing themselves, removing the protection of a legal-language expert. Work by Tatiana Tkacukova has revealed the communicative struggles of these so-called “litigants in person” as they step into the courtroom arena of seasoned legal professionals.

Trained lawyers have developed finely-tuned cross-examination techniques, and all witnesses who take the stand, including the alleged victim or plaintiff, are likely to be subjected to gruelling cross-examination, characterised by coercive and controlling questioning. At best, witnesses might emerge from the courtroom feeling frustrated, and at worst victims may leave feeling victimised once again.

The work of forensic linguists has led to progress in some areas. For instance, it is long established that the cross-examination of alleged rape victims is often underpinned by societal preconceptions and prejudices which, when combined with rigorous questioning, are found to traumatise victims further. Recent reforms in England and Wales provide rape victims with the option to avoid “live” courtroom cross-examination and may go some way towards addressing this issue.

Further afield, an international group of linguists, psychologists, lawyers and interpreters have produced a set of guidelines for communicating rights to non-native speakers of English in Australia, England and Wales, and the US. These guidelines include recommendations for the wording and communication of cautions and rights to detainees, which aim to protect those already vulnerable from further problems of misunderstanding in the justice system.

Language will forever remain integral to our criminal justice system, and it will continue to disadvantage many who find themselves in the process. However, as the pool and remit of forensic linguists grows, there are greater opportunities to rebalance the linguistic inequalities of the legal system in favour of the layperson.

David Wright, Lecturer in Linguistics, Nottingham Trent University

This article was originally published on The Conversation. Read the original article.

English language bar for citizenship likely to further disadvantage refugees


Prime Minister Malcolm Turnbull has proposed tougher language requirements for new citizenship applicants.
Lukas Coch/AAP

Sally Baker, University of Newcastle and Rachel Burke, University of Newcastle

Citizenship applicants will need to demonstrate a higher level of English proficiency if the government’s proposed changes to the Australian citizenship test go ahead.

Applicants will be required to reach the equivalent of Band 6 proficiency of the International English Language Testing System (IELTS).

To achieve Band 6, applicants must correctly answer 30 out of 40 questions in the reading paper and 23 out of 40 in the listening paper, while the writing paper rewards language used “accurately and appropriately”. If a candidate’s writing has “frequent” inaccuracies in grammar and spelling, they cannot achieve Band 6.

Success in IELTS requires proficiency not only in the English language, but also in knowing how to take – and pass – a test. The proposed changes will therefore make it harder for people with fragmented educational backgrounds, such as many refugees, to become citizens.

How do the tests currently work?

The current citizenship test consists of 20 multiple-choice questions in English concerning Australia’s political system, history, and citizen responsibilities.

While the test does not require demonstration of English proficiency per se, it acts as an indirect assessment of language.

For example, the question: “Which official symbol of Australia identifies Commonwealth property?” demonstrates the level of linguistic complexity required.

The IELTS test is commonly taken for immigration purposes as a requirement for certain visa categories; however, the test’s designers argue that it was never intended for this purpose. Researchers have argued that the growing strength of English as the language of politics and economics has resulted in its widespread use for immigration purposes.

Impact of proposed changes

English is undoubtedly important for participation in society, but deciding citizenship based on a high-stakes language test could further marginalise community members, such as people with refugee backgrounds who have the greatest need for citizenship, yet lack the formal educational background to navigate such tests.

The Refugee Council of Australia argues that adults with refugee backgrounds will be hardest hit by the proposed language test.

Data shows that refugees are both more likely to apply for citizenship, and twice as likely as other migrant groups to have to retake the test.

Mismatched proficiency expectations

The Adult Migrant English Program (AMEP), where many adult refugees access English learning upon arrival, expects only a “functional” level of language proficiency.

For many adult refugees – who have minimal first language literacy, fragmented educational experiences, and limited opportunities to gain feedback on their written English – the required level of “competency” may be a barrier to gaining citizenship. This is also more likely to impact refugee women, who are less likely to have had formal schooling and more likely to assume caring duties.

Bar too high?

The challenges faced in re/settlement contexts, such as pressures of work and financial responsibilities to extended family, often combine to make learning a language difficult and, by extension, prevent refugees from completing the citizenship test.

Similar patterns are evident with IELTS. Nearly half of Arabic speakers who took the IELTS in 2015 scored lower than Band 6.

There are a number of questions to clarify regarding the proposed language proficiency test:

  • Will those dealing with trauma-related experiences gain exemption from a high-stakes, time-pressured examination?
  • What support mechanisms will be provided to assist applicants to study for the test?
  • Will financially-disadvantaged members of the community be expected to pay for classes/materials in order to prepare for the citizenship test?
  • The IELTS test costs A$330, with no subsidies available. Will the IELTS-based citizenship/language test attract similar fees?

There are also questions about the fairness of requiring applicants to demonstrate a specific type and level of English under examination conditions that is not required of all citizens. Those born in Australia are not required to pass an academic test of language in order to retain their citizenship.

Recognising diversity of experiences

There are a few things the government should consider before introducing a language test:

1) Community consultation is essential. Input from community/migrant groups, educators, and language assessment specialists will ensure the test functions as a valid evaluation of progression towards English language proficiency. The government is currently calling for submissions related to the new citizenship test.

2) Design the test to value different forms and varieties of English that demonstrate progression in learning rather than adherence to prescriptive standards.

3) Provide educational opportunities that build on existing linguistic strengths that help people to prepare for the test.

Equating a particular type of language proficiency with a commitment to Australian citizenship is a complex and ideologically-loaded notion. The government must engage in careful consideration before potentially further disadvantaging those most in need of citizenship.

Sally Baker, Research Associate, Centre of Excellence for Equity in Higher Education, University of Newcastle and Rachel Burke, Lecturer, University of Newcastle

This article was originally published on The Conversation. Read the original article.

Is there such a thing as a national sense of humour?


A statue celebrating Monty Python’s sketch The Dead Parrot near London’s Tower Bridge ahead of a live show on the TV channel Gold.
DAVID HOLT/Flickr, CC BY-SA

Gary McKeown, Queen’s University Belfast

We’re all aware that there are stereotypes. The British are sharply sarcastic, the Americans are great at physical comedy, and the Japanese love puns. But is humour actually driven by culture to any meaningful extent? Couldn’t it be more universal – or depend largely on the individual?

There are some good reasons to believe that there is such a thing as a national sense of humour. But let’s start with what we actually have in common, by looking at the kinds of humour that most easily transcend borders.

Certain kinds of humour are more commonly used in circumstances that are international and multicultural in nature – such as airports. When it comes to onboard entertainment, airlines, in particular, are fond of humour that transcends cultural and linguistic boundaries for obvious reasons. Slapstick humour and the bland but almost universally tolerable social transgressions and faux pas of Mr Bean permit a safe, gentle humour that we can all relate to. Also, the silent situational dilemmas of the Canadian Just for Laughs hidden camera reality television show have been a staple option for airlines for many years.

Just for laughs.

These have a broad reach and are probably unlikely to offend most people. Of course, an important component in their broad appeal is that they are not really based on language.

Language and culture

Most humour, and certainly humour that involves greater cognitive effort, is deeply embedded in language and culture. It relies on a shared language or set of culturally based constructs to function. Puns and idioms are obvious examples.

Indeed, most modern theories of humour suggest that some form of shared knowledge is one of the key foundations of humour – that is, after all, what a culture is.

Some research has demonstrated this. One study measured humour in Singaporean college students and compared it with that of North American and Israeli students. This was done using a questionnaire asking participants to describe jokes they found funny, among other things. The researchers found that the Americans were more likely to tell sex jokes than the Singaporeans. The Singaporean jokes, on the other hand, were slightly more often focused on violence. The researchers interpreted the lack of sex jokes among Singaporean students to be a reflection of a more conservative society. Aggressive jokes may be explained by a cultural emphasis on strength for survival.

International humour?
C.P.Storm/Flickr, CC BY-SA

Another study compared Japanese and Taiwanese students’ appreciation of English jokes. It found that the Taiwanese generally enjoyed jokes more than the Japanese and were also more eager to understand incomprehensible jokes. The authors argued that this could be down to a more hierarchical culture in Japan, leaving less room for humour.

Denigration and self-deprecation

There are many overarching themes that can be used to define a nation’s humour. A nation that laughs together is one that can show strong allegiance among its citizens. Laughter is one of our main social signals and combined with humour it can emphasise social bonding – albeit sometimes at the cost of denigrating other groups. This can be seen across many countries. For example, the French tend to enjoy a joke about the Belgians while Swedes make fun of Norwegians. Indeed, most nations have a preferred country that serves as a traditional butt of their jokes.

Sexist and racist humour are also examples of this sort of denigration. The types of jokes used can vary across cultures, but the phenomenon itself can boost social bonding. Knowledge of acceptable social boundaries is therefore crucial and reinforces social cohesion. Since denigration is usually not the principal aim of the interaction, people often fail to realise that they are being offensive when they were “only joking”. However, as the world becomes more global and tolerant of difference, this type of humour is much less acceptable in cultures that welcome diversity.

Self-denigration or self-deprecation is also important – if it is relatively mild and remains within acceptable social norms. Benign violation theory argues that something that threatens social or cultural norms can also result in humour.

Importantly, what constitutes a benign level of harm is strongly culturally bound and differs from nation to nation, between social groups within nations and over the course of a nation’s history. What was once tolerable as national humour can now seem very unacceptable. For the British, it may be acceptable to make fun of Britons being overly polite, orderly or reluctant to talk to strangers. However, jokes about the nature of Britain’s colonial past would be much more contentious – they would probably violate social norms without being emotionally benign.

Another factor is our need to demonstrate that we understand the person we are joking with. My own ideas suggest we even have a desire to display skills of knowing what another person thinks – mind-reading in the scientific sense. For this, cultural alignment and an ability to display it are key elements in humour production and appreciation – it can make us joke differently with people from our own country than with people from other cultures.

‘Fork handles’.

For example, most people in the UK know that the popular phrase “don’t mention the war” refers to a Fawlty Towers episode. Knowing that “fork handles” is funny also marks you as a UK citizen (see video above). Similarly, knowledge of “I Love Lucy” or quotes from Seinfeld create affiliation among many in the US, while references to “Chavo del Ocho” or “Chapulín Colorado” do the same for Mexicans and most Latin Americans.

These shared cultural motifs – here drawn mostly from television – are one important aspect of a national sense of humour. They create a sense of belonging and camaraderie. They make us feel more confident about our humour and can be used to build further jokes on.

A broadly shared sense of humour is probably one of our best indicators for how assimilated we are as a nation. Indeed, a nation’s humour is more likely to show unity within a country than to display a nation as being different from other nations in any meaningful way.

Gary McKeown, Senior Lecturer of Psychology, Queen’s University Belfast

This article was originally published on The Conversation. Read the original article.

Accessible, engaging textbooks could improve children’s learning


It’s not enough for textbooks just to be present in a classroom. They must support learning.
Global Partnership for Education/Flickr, CC BY-NC-ND

Lizzi O. Milligan, University of Bath

Textbooks are a crucial part of any child’s learning. A large body of research has proved this many times and in many very different contexts. Textbooks are a physical representation of the curriculum in a classroom setting. They are powerful in shaping the minds of children and young people.

UNESCO has recognised this power and called for every child to have a textbook for every subject. The organisation argues that

next to an engaged and prepared teacher, well-designed textbooks in sufficient quantities are the most effective way to improve instruction and learning.

But there’s an elephant in the room when it comes to textbooks in African countries’ classrooms: language.

Rwanda is one of many African countries that’s adopted a language instruction policy which sees children learning in local or mother tongue languages for the first three years of primary school. They then transition in upper primary and secondary school into a dominant, so-called “international” language. This might be French or Portuguese. In Rwanda, it has been English since 2008.

Evidence from across the continent suggests that at this transition point, many learners have not developed basic literacy and numeracy skills. And, significantly, they have not acquired anywhere near enough of the language they are about to learn in to be able to engage in learning effectively.

I do not wish to advocate for English medium instruction, and the arguments for mother-tongue based education are compelling. But it’s important to consider strategies for supporting learners within existing policy priorities. Using appropriate learning and teaching materials – such as textbooks – could be one such strategy.

A different approach

It’s not enough to just hand out textbooks in every classroom. The books need to tick two boxes: learners must be able to read them and teachers must feel enabled to teach with them.

Existing textbooks tend not to take these concerns into consideration. The language is too difficult and the sentence structures too complex. The paragraphs are too long and there are no glossaries to define unfamiliar words. And while textbooks are widely available to those in the basic education system, they are rarely used systematically. Teachers cite the books’ inaccessibility as one of the main reasons for not using them.

A recent initiative in Rwanda has sought to address this through the development of “language supportive” textbooks for primary 4 learners who are around 11 years old. These were specifically designed in collaboration with local publishers, editors and writers.

Language supportive textbooks have been shown to make a difference in some Rwandan classrooms.

There are two key elements to a “language supportive” textbook.

Firstly, they are written at a language level which is appropriate for the learner. As can be seen in Figure 1, the new concept is introduced in English that is as simple as possible. The sentence structure and paragraph length are also shortened and simplified. The key word (here, “soil”) is repeated numerous times so that the learner becomes accustomed to it.

University of Bristol and the British Council

Secondly, they include features – activities, visuals, clear signposting and vocabulary support – that enable learners to practice and develop their language proficiency while learning the key elements of the curriculum.

The books are full of relevant activities that encourage learners to regularly practice their listening, speaking, reading and writing of English in every lesson. This enables language development.

Crucially, all of these activities are made accessible to learners – and teachers – by offering support in the learners’ first language. In this case, the language used was Kinyarwanda, which is the first language for the vast majority of Rwandan people. However, it’s important to note that initially many teachers were hesitant about incorporating Kinyarwanda into their classroom practice because of the government’s English-only policy.

Improved test scores

The initiative was introduced with 1075 students at eight schools across four Rwandan districts. The evidence from our initiative suggests that learners in classrooms where these books were systematically used learnt more across the curriculum.

When these learners sat tests before using the books, they scored similar results to those in other comparable schools. After using the materials for four months, their test scores were significantly higher. Crucially, both learners and teachers pointed out how important it was that the books sanctioned the use of Kinyarwanda. The classrooms became bilingual spaces and this increased teachers’ and learners’ confidence and competence.

All of this supports the importance of textbooks as effective learning and teaching materials in the classroom and shows that they can help all learners. But authorities mustn’t assume that textbooks are being used or that the existing books are empowering teachers and learners.

Textbooks can matter – but it’s only when consideration is made for the ways they can help all learners that we can say that they can contribute to quality education for all.

Lizzi O. Milligan, Lecturer in International Education, University of Bath

This article was originally published on The Conversation. Read the original article.

Younger is not always better when it comes to learning a second language


Learning a language in a classroom is best for early teenagers.
from http://www.shutterstock.com

Warren Midgley, University of Southern Queensland

It’s often thought that it is better to start learning a second language at a young age. But research shows that this is not necessarily true. In fact, the best age to start learning a second language can vary significantly, depending on how the language is being learned.

The belief that younger children are better language learners is based on the observation that children learn to speak their first language with remarkable skill at a very early age.

Before they can add two small numbers or tie their own shoelaces, most children develop a fluency in their first language that is the envy of adult language learners.

Why younger may not always be better

Two theories from the 1960s continue to have a significant influence on how we explain this phenomenon.

The theory of “universal grammar” proposes that children are born with an instinctive knowledge of the language rules common to all humans. Upon exposure to a specific language, such as English or Arabic, children simply fill in the details around those rules, making the process of learning a language fast and effective.

The other theory, known as the “critical period hypothesis”, posits that at around the age of puberty most of us lose access to the mechanism that made us such effective language learners as children. These theories have been contested, but nevertheless they continue to be influential.

Despite what these theories would suggest, however, research into language learning outcomes demonstrates that younger may not always be better.

In some language learning and teaching contexts, older learners can be more successful than younger children. It all depends on how the language is being learned.

Language immersion environment best for young children

Living, learning and playing in a second language environment on a regular basis is an ideal learning context for young children. Research clearly shows that young children are able to become fluent in more than one language at the same time, provided there is sufficient engagement with rich input in each language. In this context, it is better to start as young as possible.

Learning in classroom best for early teens

Learning in language classes at school is an entirely different context. The normal pattern for these classes is one or more hour-long lessons per week.

To succeed at learning with such little exposure to rich language input requires meta-cognitive skills that do not usually develop until early adolescence.

For this style of language learning, the later years of primary school are an ideal time to start, to maximise the balance between meta-cognitive skill development and the number of consecutive years of study available before the end of school.

Self-guided learning best for adults

There are, of course, some adults who decide to start to learn a second language on their own. They may buy a study book, sign up for an online course, purchase an app or join face-to-face or virtual conversation classes.

To succeed in this learning context requires a range of skills that are not usually developed until reaching adulthood, including the ability to remain self-motivated. Therefore, self-directed second language learning is more likely to be effective for adults than younger learners.

How we can apply this to education

What does this tell us about when we should start teaching second languages to children? In terms of the development of language proficiency, the message is fairly clear.

If we are able to provide lots of exposure to rich language use, early childhood is better. If the only opportunity for second language learning is through more traditional language classes, then late primary school is likely to be just as good as early childhood.

However, if language learning relies on being self-directed, it is more likely to be successful after the learner has reached adulthood.

Warren Midgley, Associate Professor of Applied Linguistics, University of Southern Queensland

This article was originally published on The Conversation. Read the original article.

Tough immigration laws are hitting Britain’s curry houses hard


shutterstock

Emily Falconer, University of Westminster

The British curry industry is responsible for 100,000 jobs and contributes more than £4 billion to the UK economy. But it’s now feared that up to a third of all Indian restaurants could disappear because of tougher immigration laws.

The current rules require restaurants that want to employ a chef from outside the EU to pay a minimum salary of £35,000 – or £29,750 with accommodation and food – to secure a visa.

These high costs have meant that many restaurants are unable to hire the skilled chefs they need – leading to a shortage of top talent, with the chefs who are available demanding higher wages. This combination of rising costs and a shortage of chefs means that many curry houses are now facing closure.

Fusion food

Britain has a long, deep relationship with what is widely known as “Indian” food. But food eaten on the Indian subcontinent is so diverse that it has as many differences as it has similarities – meaning that “Indian” and “curry” are often used as umbrella terms for what is in reality a multifaceted combination of tastes and influences.

It’s been predicted that more than half of all curry houses may shut down within ten years.
Shutterstock

“Indian food” in reality is often derived from particular regions of India, Pakistan, Bangladesh and Sri Lanka as well as across Britain and Europe. And a long and complex history of colonialism and migration has made the “British Curry” a popular national dish.

As the author Panikos Panayi explains, decades of residing in Britain have inevitably changed the tastes and eating practices of many British Asian communities – whose connection with traditional foods has become increasingly tenuous.

In his book Spicing Up Britain: The Multicultural History of British Food, Panayi charts the patterns of migration and the influences of food, taste and consumption habits. He follows the tastes of British Asians who have grown up with a fusion of tastes and influences all their lives.

These are people whose diets reflect the variants of English food their parents invented to make use of the ingredients readily available to them – as opposed to just tastes from the Indian subcontinent. It meant childhood classics became spicy cheese on toast, or baked beans balti with spring onion sabji and masala burgers.

Merging of tastes

Panayi claims that the taste of South Asian food became as much a part of the childhood tastes of white British children living in certain areas of the UK as of their second- and third-generation Asian school friends.

In the London borough of Tower Hamlets for example – which is home to a large Bangladeshi community – local councillors played a significant role in influencing the content of school dinners. As early as the 1980s these lunches often included Asian vegetarian dishes, such as chapattis, rice and halal meat alongside “English” staples of chips, peas and steamed sponge with custard.

Fish and chips and curry sauce – a British speciality.
Flickr/Liz Barker, CC BY-NC

These tastes shaped the palates of many British children, to the point where a combination of “English” food and “curry” became the nostalgic taste of childhood. This was commodified by major brands such as Bisto with their “curry sauce” gravy granules.

These combinations are still a main feature of many “greasy spoon” English cafes or pub menus – which feature British staples such as curry served with a choice of either rice or chips, or jacket potatoes with a spicy chicken tikka filling. Then there’s the coronation chicken sandwich – a blend of boiled chicken, curry powder, mayonnaise and sultanas – a nod to the dish created for Queen Elizabeth II’s Coronation lunch in 1953.

More recently, in a time of gastronomic obsession and “foodie” culture, the “hybridisation” of cuisines has shifted from being a matter of necessity – due to availability of ingredients – to an increasingly sophisticated, cosmopolitan and fashionable food trend.

‘One spicy crab coming right up’.
Shutterstock

The influential taste of the British curry can now be identified on modern British fine dining menus, where fillets of Scottish salmon, hand-dived scallops and Cornish crabmeat are infused with cumin, turmeric and fenugreek, while bread and butter pudding is laced with cardamom and saffron.

Multicultural Britain

But in the current political climate of migration restrictions, the free movement of people across borders looks ever more threatened – and with it our rich cultural heritage as a multicultural country.

As diverse as the food on our plates.
Shutterstock

This will undoubtedly have a detrimental impact on imported food produce and ingredients. It will also affect the diverse communities which have brought with them long histories of knowledge, recipes and cooking practices.

Of course, throughout history there has always been a degree of racism and resistance to “foreign” foods, but for the most part these tastes have been embraced and firmly appropriated into the British diet.

Perhaps then we can take heart during this uncertain time that merging cultures will be a British tradition that is set to continue. Because what started as the “taste of the other” is now so deeply ingrained in our food, culture and identity that it is no longer possible to disentangle national, regional or local tastes to determine what belongs where.

Emily Falconer, Lecturer in Sociology, University of Westminster

This article was originally published on The Conversation. Read the original article.