How learning a new language improves tolerance


Why learn a new language?
Timothy Vollmer, CC BY

Amy Thompson, University of South Florida

There are many benefits to knowing more than one language. For example, it has been shown that aging adults who speak more than one language are less likely to develop dementia.

Additionally, the bilingual brain becomes better at filtering out distractions, and learning multiple languages improves creativity. Evidence also shows that learning subsequent languages is easier than learning the first foreign language.

Unfortunately, not all American universities consider learning foreign languages a worthwhile investment.

Why is foreign language study important at the university level?

As an applied linguist, I study how learning multiple languages can have cognitive and emotional benefits. One of these benefits that’s not obvious is that language learning improves tolerance.

This happens in two important ways.

The first is that it opens people’s eyes to ways of doing things that are different from their own, which is called “cultural competence.”

The second is related to the comfort level of a person when dealing with unfamiliar situations, or “tolerance of ambiguity.”

Gaining cross-cultural understanding

Cultural competence is key to thriving in our increasingly globalized world. How specifically does language learning improve cultural competence? The answer can be illuminated by examining different types of intelligence.

Psychologist Robert Sternberg’s research on intelligence describes different types of intelligence and how they are related to adult language learning. What he refers to as “practical intelligence” is similar to social intelligence in that it helps individuals learn nonexplicit information from their environments, including meaningful gestures or other social cues.

Learning a foreign language reduces social anxiety.
COD Newsroom, CC BY

Language learning inevitably involves learning about different cultures. Students pick up clues about the culture both in language classes and through meaningful immersion experiences.

Researchers Hanh Thi Nguyen and Guy Kellogg have shown that when students learn another language, they develop new ways of understanding culture through analyzing cultural stereotypes. They explain that “learning a second language involves the acquisition not only of linguistic forms but also ways of thinking and behaving.”

With the help of an instructor, students can critically think about stereotypes of different cultures related to food, appearance and conversation styles.

Dealing with the unknown

The second way that adult language learning increases tolerance is related to a person’s comfort level when dealing with unfamiliar situations, or “tolerance of ambiguity.”

Someone with a high tolerance of ambiguity finds unfamiliar situations exciting, rather than frightening. My research on motivation, anxiety and beliefs indicates that language learning improves people’s tolerance of ambiguity, especially when more than one foreign language is involved.

It’s not difficult to see why this may be so. Conversations in a foreign language will inevitably involve unknown words. It wouldn’t be a successful conversation if one of the speakers constantly stopped to say, “Hang on – I don’t know that word. Let me look it up in the dictionary.” Those with a high tolerance of ambiguity would feel comfortable maintaining the conversation despite the unfamiliar words involved.

Applied linguists Jean-Marc Dewaele and Li Wei also study tolerance of ambiguity and have indicated that those with experience learning more than one foreign language in an instructed setting have more tolerance of ambiguity.

What changes with this understanding

A high tolerance of ambiguity brings many advantages. It helps students become less anxious in social interactions and in subsequent language learning experiences. Not surprisingly, the more experience a person has with language learning, the more comfortable the person gets with this ambiguity.

And that’s not all.

Individuals with higher levels of tolerance of ambiguity have also been found to be more entrepreneurial (i.e., are more optimistic, innovative and don’t mind taking risks).

In the current climate, universities are frequently being judged by the salaries of their graduates. Taking it one step further, given the relationship between tolerance of ambiguity and entrepreneurial intention, increased tolerance of ambiguity could lead to higher salaries for graduates, which in turn, I believe, could help increase funding for those universities that require foreign language study.

Those who have devoted their lives to theorizing about and teaching languages would say, “It’s not about the money.” But perhaps it is.

Language learning in higher ed

Most American universities have a minimal language requirement that often varies depending on the student’s major. However, students can typically opt out of the requirement by taking a placement test or providing some other proof of competency.

Why more universities should teach a foreign language.
sarspri, CC BY-NC

In contrast to this trend, Princeton recently announced that all students, regardless of their competency when entering the university, would be required to study an additional language.

I’d argue that more universities should follow Princeton’s lead. Language study at the university level could foster greater tolerance of the different cultural norms represented in American society – something desperately needed in the current political climate, with a wave of hate crimes sweeping university campuses nationwide.

Knowledge of different languages is crucial to becoming global citizens. As former Secretary of Education Arne Duncan noted,

“Our country needs to create a future in which all Americans understand that by speaking more than one language, they are enabling our country to compete successfully and work collaboratively with partners across the globe.”

Considering the evidence that studying languages as adults increases tolerance in two important ways, the question shouldn’t be “Why should universities require foreign language study?” but rather “Why in the world wouldn’t they?”

Amy Thompson, Associate Professor of Applied Linguistics, University of South Florida

This article was originally published on The Conversation. Read the original article.

 


Younger is not always better when it comes to learning a second language


Learning a language in a classroom is best for early teenagers.
from http://www.shutterstock.com

Warren Midgley, University of Southern Queensland

It’s often thought that it is better to start learning a second language at a young age. But research shows that this is not necessarily true. In fact, the best age to start learning a second language can vary significantly, depending on how the language is being learned.

The belief that younger children are better language learners is based on the observation that children learn to speak their first language with remarkable skill at a very early age.

Before they can add two small numbers or tie their own shoelaces, most children develop a fluency in their first language that is the envy of adult language learners.

Why younger may not always be better

Two theories from the 1960s continue to have a significant influence on how we explain this phenomenon.

The theory of “universal grammar” proposes that children are born with an instinctive knowledge of the language rules common to all humans. Upon exposure to a specific language, such as English or Arabic, children simply fill in the details around those rules, making the process of learning a language fast and effective.

The other theory, known as the “critical period hypothesis”, posits that at around the age of puberty most of us lose access to the mechanism that made us such effective language learners as children. These theories have been contested, but nevertheless they continue to be influential.

Despite what these theories would suggest, however, research into language learning outcomes demonstrates that younger may not always be better.

In some language learning and teaching contexts, older learners can be more successful than younger children. It all depends on how the language is being learned.

Language immersion environment best for young children

Living, learning and playing in a second language environment on a regular basis is an ideal learning context for young children. Research clearly shows that young children are able to become fluent in more than one language at the same time, provided there is sufficient engagement with rich input in each language. In this context, it is better to start as young as possible.

Learning in classroom best for early teens

Learning in language classes at school is an entirely different context. The normal pattern of these classes is to have one or more hourly lessons per week.

To succeed at learning with such little exposure to rich language input requires meta-cognitive skills that do not usually develop until early adolescence.

For this style of language learning, the later years of primary school are an ideal time to start, to maximise the balance between meta-cognitive skill development and the number of consecutive years of study available before the end of school.

Self-guided learning best for adults

There are, of course, some adults who decide to start to learn a second language on their own. They may buy a study book, sign up for an online course, purchase an app or join face-to-face or virtual conversation classes.

To succeed in this learning context requires a range of skills that are not usually developed until reaching adulthood, including the ability to remain self-motivated. Therefore, self-directed second language learning is more likely to be effective for adults than younger learners.

How we can apply this to education

What does this tell us about when we should start teaching second languages to children? In terms of the development of language proficiency, the message is fairly clear.

If we are able to provide lots of exposure to rich language use, early childhood is better. If the only opportunity for second language learning is through more traditional language classes, then late primary school is likely to be just as good as early childhood.

However, if language learning relies on being self-directed, it is more likely to be successful after the learner has reached adulthood.

Warren Midgley, Associate Professor of Applied Linguistics, University of Southern Queensland

This article was originally published on The Conversation. Read the original article.

Tough immigration laws are hitting Britain’s curry houses hard


shutterstock

Emily Falconer, University of Westminster

The British curry industry is responsible for 100,000 jobs and contributes more than £4 billion to the UK economy. But it’s now feared that up to a third of all Indian restaurants could disappear because of tougher immigration laws.

The current rules require restaurants that want to employ a chef from outside the EU to pay a minimum salary of £35,000 – or £29,750 with accommodation and food – to secure a visa.

These high costs mean that many restaurants are unable to hire the skilled chefs they need – leading to a shortage of top talent, with the chefs who are available demanding higher wages. This combination of rising costs and a shortage of chefs means that many curry houses are now facing closure.

Fusion food

Britain has a long, deep relationship with what is widely known as “Indian” food. But the food eaten on the Indian subcontinent is so diverse that it has as many differences as it has similarities, which means “Indian” and “curry” are often used as umbrella terms for what is in reality a multifaceted combination of tastes and influences.

It’s been predicted that more than half of all curry houses may shut down within ten years.
Shutterstock

“Indian food” in reality is often derived from particular regions of India, Pakistan, Bangladesh and Sri Lanka as well as across Britain and Europe. And a long and complex history of colonialism and migration has made the “British Curry” a popular national dish.

As the author Panikos Panayi explains, decades of residing in Britain have inevitably changed the tastes and eating practices of many British Asian communities – whose connection with traditional foods has become increasingly tenuous.

In his book Spicing Up Britain: The Multicultural History of British Food, Panayi charts the patterns of migration and the influences of food, taste and consumption habits. He follows the tastes of British Asians who have grown up with a fusion of tastes and influences all their lives.

These are people whose diets reflect the variants of English food their parents invented to make use of the ingredients readily available to them – as opposed to just tastes from the Indian subcontinent. It meant childhood classics became spicy cheese on toast, or baked beans balti with spring onion sabji and masala burgers.

Merging of tastes

Panayi claims that the taste of South Asian food became as much a part of the childhood tastes of white British children living in certain areas of the UK as it did for their second and third generation Asian school friends.

In the London borough of Tower Hamlets for example – which is home to a large Bangladeshi community – local councillors played a significant role in influencing the content of school dinners. As early as the 1980s these lunches often included Asian vegetarian dishes, such as chapattis, rice and halal meat alongside “English” staples of chips, peas and steamed sponge with custard.

Fish and chips and curry sauce – a British speciality.
Flickr/Liz Barker, CC BY-NC

These tastes shaped the palates of many British children, to the point where a combination of “English” food and “curry” became the nostalgic taste of childhood. This was commodified by major brands such as Bisto with their “curry sauce” gravy granules.

These combinations are still a mainstay of many “greasy spoon” English cafes and pub menus, which feature British staples such as curry served with a choice of either rice or chips, or jacket potatoes with a spicy chicken tikka filling. Then there’s the coronation chicken sandwich – a blend of boiled chicken, curry powder, mayonnaise and sultanas – a nod to the dish created for Queen Elizabeth II’s Coronation lunch in 1953.

More recently, in a time of gastronomic obsession and “foodie” culture, the “hybridisation” of cuisines has shifted from being a matter of necessity – due to availability of ingredients – to an increasingly sophisticated, cosmopolitan and fashionable food trend.

‘One spicy crab coming right up’.
Shutterstock

The influential taste of the British curry can now be identified on modern British fine dining menus, where fillets of Scottish salmon, hand-dived scallops and Cornish crabmeat are infused with cumin, turmeric and fenugreek, while bread and butter pudding is laced with cardamom and saffron.

Multicultural Britain

But in the current political climate of migration restrictions, the free movement of people across borders looks ever more threatened – and with it, our rich cultural heritage as a multicultural country.

As diverse as the food on our plates.
Shutterstock

This will undoubtedly have a detrimental impact on imported food produce and ingredients. And it will also impact the diverse communities which have brought with them long histories of knowledge, recipes and cooking practices.

Of course, throughout history there has always been a degree of racism and resistance to “foreign” foods, but for the most part these tastes have been embraced and firmly appropriated into the British diet.

Perhaps then we can take heart during this uncertain time that merging cultures will be a British tradition that is set to continue. Because what started as the “taste of the other” is now so deeply ingrained in our food, culture and identity that it is no longer possible to disentangle national, regional or local tastes to determine what belongs where.

Emily Falconer, Lecturer in Sociology, University of Westminster

This article was originally published on The Conversation. Read the original article.

What’s the point of education if Google can tell us anything?



Ibrar Bhatt, Lancaster University

Can’t remember the names of the two elements that scientist Marie Curie discovered? Or who won the 1945 UK general election? Or how far away the sun is from the earth? Ask Google.

Constant access to an abundance of online information at the click of a mouse or tap of a smartphone has radically reshaped how we socialise, inform ourselves of the world around us and organise our lives. If all facts can be summoned instantly by looking online, what’s the point of spending years learning them at school and university? In the future, it might be that once young people have mastered the basics of how to read and write, they undertake their entire education merely through accessing the internet via search engines such as Google, as and when they want to know something.

Some educational theorists have argued that you can replace teachers, classrooms, textbooks and lectures by simply leaving students to their own devices to search and collect information about a particular topic online. Such ideas have called into question the value of a traditional system of education, one in which teachers simply impart knowledge to students. Of course, others have warned against the dangers of this kind of thinking and the importance of the teacher and human contact when it comes to learning.

Such debate about the place and purpose of online searching in learning and assessments is not new. But rather than thinking of ways to prevent students from cheating or plagiarising in their assessed pieces of work, maybe our obsession with the “authenticity” of their coursework or assessment is missing another important educational point.

Digital content curators

In my recent research looking at the ways students write their assignments, I found that increasingly they may not always compose written work which is truly “authentic”, and that this may not be as important as we think. Instead, through prolific use of the internet, students engaged in a number of sophisticated practices to search, sift, critically evaluate, anthologise and re-present pre-existing content. Through a close examination of the moment-by-moment work of the way students write assignments, I came to see how all the pieces of text students produced contained elements of something else. These practices need to be better understood and then incorporated into new forms of education and assessment.

These online practices are about harnessing an abundance of information from a multitude of sources, including search engines like Google, in what I call a form of “digital content curation”. Curation in this sense is about how learners use existing content to produce new content through engaging in problem-solving and intellectual inquiry, and creating a new experience for readers.

Lessons in how to search.
Students via bikeriderlondon/www.shutterstock.com

Part of this is developing a critical eye about what’s being searched for online, or “crap-detection”, whilst wading through the deluge of available information. This aspect is vital to any educationally serious notion of information curation, as learners increasingly use the web as extensions of their own memory when searching.

Students must begin by understanding that most online content is already curated by search engines like Google, using their PageRank algorithm and other indicators. Curation, therefore, becomes a kind of stewardship of other people’s writing and requires entering into a conversation with the writers of those texts. It is a crucial kind of “digital literacy.”
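
To make concrete what “already curated” means here, the sketch below shows the basic idea behind PageRank as a toy example: pages accumulate rank by being linked to from other well-ranked pages, so the results a search engine returns arrive pre-sorted. This is a simplified illustration only, not Google’s production algorithm; the function name, damping factor and miniature web graph are all assumptions made for the example.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank by power iteration.

    links: dict mapping each page to the list of pages it links to.
    Returns a dict of page -> rank score (scores sum to roughly 1).
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}

    for _ in range(iterations):
        # Every page keeps a small baseline share of rank...
        new_rank = {p: (1.0 - damping) / n for p in pages}
        # ...and passes the rest along its outgoing links.
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # A page with no outgoing links spreads its rank evenly.
                share = damping * rank[page] / n
                for p in pages:
                    new_rank[p] += share
        rank = new_rank
    return rank

# Hypothetical three-page web: the page everyone links to ends up ranked highest.
toy_web = {"a": ["b"], "b": ["c"], "c": ["b"]}
print(sorted(pagerank(toy_web).items(), key=lambda kv: -kv[1]))
```

The point for learners is simply that this kind of link-based ordering has already shaped what they see before they begin their own evaluation and curation of it.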

Curation has, through pervasive connectivity, found its way into educational contexts. There is now a need to better understand how practices of online searching and the kinds of writing emerging from curation can be incorporated into the way we assess students.

How to assess these new skills

While writing for assessment tends to focus on the production of a student’s own, “authentic” work, it could also take curation practices into account. Take, for example, a project designed as a kind of digital portfolio. This could require students to locate information on a particular question, organise existing web extracts in a digestible and story-like way, acknowledge their sources, and present an argument or thesis.

Solving problems through synthesising large amounts of information, often collaboratively, and engaging in exploratory and problem-solving pursuits (rather than just memorising facts and dates) are key skills in the 21st-century, information-based economy. As the London Chamber of Commerce has highlighted, we must make sure young people and graduates enter employment with these skills.

My own research has shown that young people may already be expert curators as part of their everyday internet experience and surreptitious assignment writing strategies. Teachers and lecturers need to explore and understand these practices better, and create learning opportunities and academic assessment tasks around these somewhat “hard to assess” skills.

In an era of informational abundance, educational end-products – the exam or piece of coursework – need to become less about a single student creating an “authentic” text, and more about a certain kind of digital literacy which harnesses the wisdom of the network of information that is available at the click of a button.


Ibrar Bhatt, Senior Research Associate, Lancaster University

This article was originally published on The Conversation. Read the original article.