CES moves into Canada with Toronto school



The institution will be renamed CES Toronto and will undergo a “slow rebranding,” CES managing director Justin Quinn told The PIE News.

The group has acquired 100% shareholding from the previous owner and president of Global Village Toronto, Genevieve Bouchard, who has recently retired.

Robin Adams, formerly Global Village's marketing director, has been appointed president of the newly named centre; the rest of the staff will remain in place.

“[Canada] is a very exciting market to be present in”

The move is the result of CES’ long-time interest in the Canadian market, which Quinn said he had been observing closely, waiting for the right opportunity.

“My interest in Canada has been there for a number of years, I have been watching the market very carefully and it’s a very exciting market to be present in,” he said.

“I was just waiting for the right opportunity to come along.”

CES plans to further grow and develop the school, with a view to introducing new programs, including teacher training. Quinn also hopes the school will become a member of Eaquals, the accreditation and membership body for language schools, within the next 12 months; he is its current chair.

“One of the things that attracted me to the school is that there is growth potential, and I certainly feel we could probably be more aggressive in our growth strategy,” Quinn explained.

The school will undertake the process to maintain its Languages Canada membership, he added.

“It’s part of the conditions – [Languages Canada] will do due diligence on us as well. It’s an impressive accreditation system,” Quinn told The PIE.

“They look at the owners, and our strategy, and our plans going forward, rather than just looking at how the school is operating at a snapshot in time.”

In a statement, CES and Global Village (both IALC accredited) said they will jointly promote their locations, which include GV Calgary, CES Dublin, CES Edinburgh, CES Harrogate, GV Hawaii, CES Leeds, CES London, CES Oxford, CES Toronto, GV Vancouver, GV Victoria, and CES Worthing.

 

Posted on The PIE News by Claudia Civinini

The Study-Abroad Experience & Cross-Cultural Friendships



“It is important that academicians and educational administrators incorporate sufficient scope for students to make cross-cultural friendships”

“Living abroad should mean loving abroad,” said TEDx speaker Marina Meijer in defence of studying abroad. The study-abroad experience is about more than just studying – it is about learning. It makes students richer, better-equipped individuals.

Following the heart and heading out into international terrain is a challenge: prejudice, cultural gaps, homesickness, and adjustment issues are all real. Despite all that, integrating into another community need not be a struggle. Students have to take challenges as they come.

Over the semester, students can develop friendships that help cultural immersion. It is important that academicians and educational administrators incorporate sufficient scope for students to make cross-cultural friendships.

A few benefits of acculturation:

  • Novelty: A variety of people collaborating gives students the opportunity to see problems from a new perspective or offer insights that had not been thought of before.
  • Personal Transformation: As students become proficient in one thing, they need encouragement for the next. They might be self-motivated, but work groups and course clubs help them interact better with peers from other countries. They grow as individuals, and that growth equips them better for their different career paths.
  • Network: Students may tend to make friends with those from their community. International networking helps build creativity and teaches important lessons in teamwork and communication.
  • Empathy: The ability to climb into another’s skin and walk around in it for a bit. Empathy is vital to overcoming prejudice and narrow nationalism, and it helps in dealing with difficult people and complex situations at the microcosmic level.
  • Cultural Intelligence: This is a vital skill set to be able to work efficiently and relate to people who do not come from the same background.
  • Experiential Learning: New activities, experiences, and information like learning a new language, visiting a museum, or simply boarding a bus in a foreign land, exposes students to new things. It creates new neural connections that build on each other and create an optimal environment for learning.


Engaging in co-curricular and extracurricular activities like clubs, sports, community events, and tutor programs ensures that guest students integrate with the community, interact with people, and understand the way others live. International educational administrators need to put these activities in place before or upon students’ arrival on campus.

“Investing in students and the educated youth of the nations will ultimately help build meaningful ‘glocal’ friendships”

It is important that international educators develop a plan for the benefit of the overseas student community.

Here’s a things-to-do list for international educators:

  • Establish intercultural platforms like clubs, community events, coffee-house discussions, social events, etc.
  • Host inclusive Model United Nations conferences so students can play delegates and become sensitive to contemporary world issues.
  • Encourage students to talk about issues in their native lands so they form strong opinions and grow in their identity
  • Create educational content that is relevant to international student affairs
  • Develop good hiring and job exchange programs in the global market
  • Enable digital learning and interaction for better access and collaboration of skills and knowledge
  • Assess student performances and provide them with a list of developmental opportunities and programs

While students need to be responsible and self-motivated, educators also need a set of competencies to guide students through their educational journey. International educators are major stakeholders in helping students think critically, master a second language, work in multinational teams, communicate across cultures, widen their job horizons, and improve their access to opportunities to thrive.

Investing in students and the educated youth of the nations will ultimately help build meaningful ‘glocal’ friendships. Innovation, progressive thought, acceptance, and other human values will follow as a consequence. It is a long-term investment in the human race which will engineer a generation for a better tomorrow.

This article first appeared on July 12, 2018 at https://blog.thepienews.com/2018/07/the-study-abroad-experience-cross-cultural-friendships/

About the author: Ethan Miller is an online ESL instructor and EdTech enthusiast based in Illinois.

Language puts ordinary people at a disadvantage in the criminal justice system


‘Now, did you understand all that?’
Shutterstock

David Wright, Nottingham Trent University

Language is pervasive throughout the criminal justice system. A textual chain follows a person from the moment they are arrested until their day in court, and it is all underpinned by meticulously drafted legislation. At every step, there are challenges faced by laypeople who find themselves in the linguistic webs of the justice system.

Anyone who reads a UK act of parliament, for example, is met with myriad linguistic complexities. Archaic formulae, complex prepositions, lengthy and embedded clauses abound in the pages of the law. Such language can render legal texts inaccessible to the everyday reader. Some argue (see Vijay Bhatia’s chapter) that this is a deliberate ploy by the legal establishment to keep the non-expert at an arm’s length.

But closer to the truth is the fact that legal language, like all language in all contexts, is the way it is because of its function and purpose. Those drafting laws must ensure enough precision and unambiguity so that the law can be applied, while also being flexible and inclusive enough to account for the unpredictability of human behaviour.

The cost of this linguistic balancing act, however, is increased complexity and the exclusion of the uninitiated. Legal language has long been in the crosshairs of The Plain English Campaign which argues for its simplification, claiming that “if we can’t understand our rights, we have no rights”.

It is not only written legal language that presents difficulties for the layperson. Once someone is arrested they go through a chain of communicative events, each one coloured by institutional language, and each one with implications for the next. It begins with the arresting officer reading the suspect their rights. In England and Wales, the police caution reads:

You do not have to say anything. But, it may harm your defence if you do not mention when questioned something which you later rely on in court. Anything you do say may be given in evidence.

This may seem very familiar to many readers (perhaps due to their penchant for police dramas), but this short set of statements is linguistically complex. The strength of the verb “may”; what exactly constitutes “mentioning” or “relying”, and what “questioning” is and when it will take place, are just some of the ambiguities that may be overlooked at first glance.

What the research says

Indeed, research has found that, although people claim to fully comprehend the caution, they are often incapable of demonstrating any understanding of it at all. Frances Rock has also written extensively on the language of cautioning and found that when police officers explain the caution to detainees in custody, there is substantial variation in the explanations offered. Some explanations add clarity, while others introduce even more puzzles.

This issue of comprehensibility is compounded, of course, when the detainee is not a native speaker of English.

The word of the law.
Shutterstock

The difficulties in understanding legal language are typically overcome by the hiring of legal representation. Peter Tiersma, in his seminal 1999 book Legal Language, noted that “the hope that every man can be his own lawyer, which has existed for centuries, is probably no more realistic than having people be their own doctor”.

However, in the UK at least, cuts in legal aid mean that more people are representing themselves, removing the protection of a legal-language expert. Work by Tatiana Tkacukova has revealed the communicative struggles of these so-called “litigants in person” as they step into the courtroom arena of seasoned legal professionals.

Trained lawyers have developed finely-tuned cross-examination techniques, and all witnesses who take the stand, including the alleged victim or plaintiff, are likely to be subjected to gruelling cross-examination, characterised by coercive and controlling questioning. At best, witnesses might emerge from the courtroom feeling frustrated, and at worst victims may leave feeling victimised once again.

The work of forensic linguists has led to progress in some areas. For instance, it is long established that the cross-examination of alleged rape victims is often underpinned by societal preconceptions and prejudices which, when combined with rigorous questioning, are found to traumatise victims further. Recent reforms in England and Wales provide rape victims with the option to avoid “live” courtroom cross-examination and may go some way towards addressing this issue.

Further afield, an international group of linguists, psychologists, lawyers and interpreters have produced a set of guidelines for communicating rights to non-native speakers of English in Australia, England and Wales, and the US. These guidelines include recommendations for the wording and communication of cautions and rights to detainees, which aim to protect those already vulnerable from further problems of misunderstanding in the justice system.

Language will forever remain integral to our criminal justice system, and it will continue to disadvantage many who find themselves in the process. However, as the pool and remit of forensic linguists grows, there are greater opportunities to rebalance the linguistic inequalities of the legal system in favour of the layperson.

David Wright, Lecturer in Linguistics, Nottingham Trent University

This article was originally published on The Conversation. Read the original article.

English language bar for citizenship likely to further disadvantage refugees


Prime Minister Malcolm Turnbull has proposed tougher language requirements for new citizenship applicants.
Lukas Coch/AAP

Sally Baker, University of Newcastle and Rachel Burke, University of Newcastle

Citizenship applicants will need to demonstrate a higher level of English proficiency if the government’s proposed changes to the Australian citizenship test go ahead.

Applicants will be required to reach the equivalent of Band 6 proficiency of the International English Language Testing System (IELTS).

To achieve Band 6, applicants must correctly answer 30 out of 40 questions in the reading paper and 23 out of 40 in the listening paper, while the writing paper rewards language used “accurately and appropriately”. A candidate whose writing has “frequent” inaccuracies in grammar and spelling cannot achieve Band 6.
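The raw-score thresholds above can be read as a simple pass/fail check. The sketch below is illustrative only: it uses just the two cutoffs quoted here (30/40 reading, 23/40 listening); the real IELTS band conversion tables are more granular, and the writing criteria are judged qualitatively rather than by raw score.

```python
def meets_band_6(reading_correct: int, listening_correct: int) -> bool:
    """Check whether raw scores clear the Band 6 cutoffs quoted in the article.

    Assumes the two figures given here: 30/40 in reading, 23/40 in listening.
    This ignores the writing and speaking papers, which are marked qualitatively.
    """
    READING_CUTOFF = 30    # out of 40, per the article
    LISTENING_CUTOFF = 23  # out of 40, per the article
    return reading_correct >= READING_CUTOFF and listening_correct >= LISTENING_CUTOFF

print(meets_band_6(31, 25))  # True: clears both cutoffs
print(meets_band_6(29, 25))  # False: reading falls one question short
```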

Success in IELTS requires proficiency not only in the English language but also in knowing how to take – and pass – a test. The proposed changes will therefore make it harder for people with fragmented educational backgrounds, such as many refugees, to become citizens.

How do the tests currently work?

The current citizenship test consists of 20 multiple-choice questions in English concerning Australia’s political system, history, and citizen responsibilities.

While the test does not require demonstration of English proficiency per se, it acts as an indirect assessment of language.

For example, the question: “Which official symbol of Australia identifies Commonwealth property?” demonstrates the level of linguistic complexity required.

The IELTS test is commonly taken for immigration purposes as a requirement for certain visa categories; however, its designers argue that it was never intended for this purpose. Researchers have argued that the growing strength of English as the language of politics and economics has resulted in its widespread use for immigration purposes.

Impact of proposed changes

English is undoubtedly important for participation in society, but deciding citizenship based on a high-stakes language test could further marginalise community members, such as people with refugee backgrounds who have the greatest need for citizenship, yet lack the formal educational background to navigate such tests.

The Refugee Council of Australia argues that adults with refugee backgrounds will be hardest hit by the proposed language test.

Data shows that refugees are both more likely to apply for citizenship, and twice as likely as other migrant groups to have to retake the test.

Mismatched proficiency expectations

The Adult Migrant English Program (AMEP), where many adult refugees access English learning upon arrival, expects only a “functional” level of language proficiency.

For many adult refugees – who have minimal first language literacy, fragmented educational experiences, and limited opportunities to gain feedback on their written English – “competency” may be a prohibitive barrier to gaining citizenship. This is also more likely to impact refugee women, who are less likely to have had formal schooling and more likely to assume caring duties.

Bar too high?

The challenges faced in re/settlement contexts, such as pressures of work and financial responsibilities to extended family, often combine to make learning a language difficult and, by extension, prevent refugees from completing the citizenship test.

Similar patterns are evident with IELTS. Nearly half of Arabic speakers who took the IELTS in 2015 scored lower than Band 6.

There are a number of questions to clarify regarding the proposed language proficiency test:

  • Will those dealing with trauma-related experiences gain exemption from a high-stakes, time-pressured examination?
  • What support mechanisms will be provided to assist applicants to study for the test?
  • Will financially-disadvantaged members of the community be expected to pay for classes/materials in order to prepare for the citizenship test?
  • The IELTS test costs A$330, with no subsidies available. Will the IELTS-based citizenship/language test attract similar fees?

There are also questions about the fairness of requiring applicants to demonstrate a specific type and level of English under examination conditions that is not required of all citizens. Those born in Australia are not required to pass an academic test of language in order to retain their citizenship.

Recognising diversity of experiences

There are a few things the government should consider before introducing a language test:

1) Community consultation is essential. Input from community/migrant groups, educators, and language assessment specialists will ensure the test functions as a valid evaluation of progression towards English language proficiency. The government is currently calling for submissions related to the new citizenship test.

2) Design the test to value different forms and varieties of English that demonstrate progression in learning rather than adherence to prescriptive standards.

3) Provide educational opportunities that build on existing linguistic strengths that help people to prepare for the test.

Equating a particular type of language proficiency with a commitment to Australian citizenship is a complex and ideologically-loaded notion. The government must engage in careful consideration before potentially further disadvantaging those most in need of citizenship.

Sally Baker, Research Associate, Centre of Excellence for Equity in Higher Education, University of Newcastle and Rachel Burke, Lecturer, University of Newcastle

This article was originally published on The Conversation. Read the original article.

Is there such a thing as a national sense of humour?


A statue celebrating Monty Python’s sketch The Dead Parrot near London’s Tower Bridge ahead of a live show on the TV channel Gold.
DAVID HOLT/Flickr, CC BY-SA

Gary McKeown, Queen’s University Belfast

We’re all aware that there are stereotypes. The British are sharply sarcastic, the Americans are great at physical comedy, and the Japanese love puns. But is humour actually driven by culture to any meaningful extent? Couldn’t it be more universal – or depend largely on the individual?

There are some good reasons to believe that there is such a thing as a national sense of humour. But let’s start with what we actually have in common, by looking at the kinds of humour that most easily transcend borders.

Certain kinds of humour are more commonly used in circumstances that are international and multicultural in nature – such as airports. When it comes to in-flight entertainment, airlines, in particular, are fond of humour that transcends cultural and linguistic boundaries, for obvious reasons. Slapstick humour and the bland but almost universally tolerable social transgressions and faux pas of Mr Bean permit a safe, gentle humour that we can all relate to. The silent situational dilemmas of the Canadian Just for Laughs hidden camera reality television show have likewise been a staple option for airlines for many years.


These have a broad reach and are probably unlikely to offend most people. Of course, an important component in their broad appeal is that they are not really based on language.

Language and culture

Most humour, and certainly humour that involves greater cognitive effort, is deeply embedded in language and culture. It relies on a shared language or set of culturally based constructs to function. Puns and idioms are obvious examples.

Indeed, most modern theories of humour suggest that some form of shared knowledge is one of the key foundations of humour – that is, after all, what a culture is.

Some research has demonstrated this. One study measured humour in Singaporean college students and compared it with that of North American and Israeli students. This was done using a questionnaire asking participants to describe jokes they found funny, among other things. The researchers found that the Americans were more likely to tell sex jokes than the Singaporeans. The Singaporean jokes, on the other hand, were slightly more often focused on violence. The researchers interpreted the lack of sex jokes among Singaporean students to be a reflection of a more conservative society. Aggressive jokes may be explained by a cultural emphasis on strength for survival.

International humour?
C.P.Storm/Flickr, CC BY-SA

Another study compared Japanese and Taiwanese students’ appreciation of English jokes. It found that the Taiwanese generally enjoyed jokes more than the Japanese and were also more eager to understand incomprehensible jokes. The authors argued that this could be down to a more hierarchical culture in Japan, leaving less room for humour.

Denigration and self-deprecation

There are many overarching themes that can be used to define a nation’s humour. A nation that laughs together is one that can show it has a strong allegiance between its citizens. Laughter is one of our main social signals and combined with humour it can emphasise social bonding – albeit sometimes at the cost of denigrating other groups. This can be seen across many countries. For example, the French tend to enjoy a joke about the Belgians while Swedes make fun of Norwegians. Indeed, most nations have a preferred country that serves as a traditional butt of their jokes.

Sexist and racist humour are also examples of this sort of denigration. The types of jokes used can vary across cultures, but the phenomenon itself can boost social bonding. Knowledge of acceptable social boundaries is therefore crucial and reinforces social cohesion. As denigration is usually not the principal aim of the interaction, this explains why people often fail to realise that they are being offensive when they were “only joking”. However, as the world becomes more global and tolerant of difference, this type of humour is much less acceptable in cultures that welcome diversity.

Self-denigration or self-deprecation is also important – if it is relatively mild and remains within acceptable social norms. Benign violation theory argues that something that threatens social or cultural norms can also result in humour.

Importantly, what constitutes a benign level of harm is strongly culturally bound and differs from nation to nation, between social groups within nations and over the course of a nation’s history. What was once tolerable as national humour can now seem very unacceptable. For the British, it may be acceptable to make fun of Britons being overly polite, orderly or reluctant to talk to strangers. However, jokes about the nature of Britain’s colonial past would be much more contentious – they would probably violate social norms without being emotionally benign.

Another factor is our need to demonstrate that we understand the person we are joking with. My own ideas suggest we even have a desire to display skills of knowing what another person thinks – mind-reading in the scientific sense. For this, cultural alignment and an ability to display it are key elements in humour production and appreciation – it can make us joke differently with people from our own country than with people from other cultures.


For example, most people in the UK know that the popular phrase “don’t mention the war” refers to a Fawlty Towers sketch. Knowing that “fork handles” is funny also marks you as a UK citizen. Similarly, knowledge of “I Love Lucy” or quotes from Seinfeld creates affiliation among many in the US, while references to “Chavo del Ocho” or “Chapulín Colorado” do the same for Mexicans and most Latin Americans.

These shared cultural motifs – here drawn mostly from television – are one important aspect of a national sense of humour. They create a sense of belonging and camaraderie. They make us feel more confident about our humour and can be used to build further jokes on.

A broadly shared sense of humour is probably one of our best indicators for how assimilated we are as a nation. Indeed, a nation’s humour is more likely to show unity within a country than to display a nation as being different from other nations in any meaningful way.

Gary McKeown, Senior Lecturer of Psychology, Queen’s University Belfast

This article was originally published on The Conversation. Read the original article.

Accessible, engaging textbooks could improve children’s learning


It’s not enough for textbooks just to be present in a classroom. They must support learning.
Global Partnership for Education/Flickr, CC BY-NC-ND

Lizzi O. Milligan, University of Bath

Textbooks are a crucial part of any child’s learning. A large body of research has proved this many times and in many very different contexts. Textbooks are a physical representation of the curriculum in a classroom setting. They are powerful in shaping the minds of children and young people.

UNESCO has recognised this power and called for every child to have a textbook for every subject. The organisation argues that

next to an engaged and prepared teacher, well-designed textbooks in sufficient quantities are the most effective way to improve instruction and learning.

But there’s an elephant in the room when it comes to textbooks in African countries’ classrooms: language.

Rwanda is one of many African countries that’s adopted a language instruction policy which sees children learning in local or mother tongue languages for the first three years of primary school. They then transition in upper primary and secondary school into a dominant, so-called “international” language. This might be French or Portuguese. In Rwanda, it has been English since 2008.

Evidence from across the continent suggests that at this transition point, many learners have not developed basic literacy and numeracy skills. And, significantly, they have not acquired anywhere near enough of the language they are about to learn in to be able to engage in learning effectively.

I do not wish to advocate for English medium instruction, and the arguments for mother-tongue based education are compelling. But it’s important to consider strategies for supporting learners within existing policy priorities. Using appropriate learning and teaching materials – such as textbooks – could be one such strategy.

A different approach

It’s not enough to just hand out textbooks in every classroom. The books need to tick two boxes: learners must be able to read them and teachers must feel enabled to teach with them.

Existing textbooks tend not to take these concerns into consideration. The language is too difficult and the sentence structures too complex; the paragraphs are too long, and there are no glossaries to define unfamiliar words. And while textbooks are widely available to those in the basic education system, they are rarely used systematically. Teachers cite the books’ inaccessibility as one of the main reasons for not using them.

A recent initiative in Rwanda has sought to address this through the development of “language supportive” textbooks for primary 4 learners who are around 11 years old. These were specifically designed in collaboration with local publishers, editors and writers.

Language supportive textbooks have been shown to make a difference in some Rwandan classrooms.

There are two key elements to a “language supportive” textbook.

Firstly, they are written at a language level which is appropriate for the learner. As can be seen in Figure 1, the new concept is introduced in as simple English as possible. The sentence structure and paragraph length are also shortened and made as simple as possible. The key word (here, “soil”) is also repeated numerous times so that the learner becomes accustomed to this word.

University of Bristol and the British Council

Secondly, they include features – activities, visuals, clear signposting and vocabulary support – that enable learners to practice and develop their language proficiency while learning the key elements of the curriculum.

The books are full of relevant activities that encourage learners to regularly practice their listening, speaking, reading and writing of English in every lesson. This enables language development.

Crucially, all of these activities are made accessible to learners – and teachers – by offering support in the learners’ first language. In this case, the language used was Kinyarwanda, which is the first language for the vast majority of Rwandan people. However, it’s important to note that initially many teachers were hesitant about incorporating Kinyarwanda into their classroom practice because of the government’s English-only policy.

Improved test scores

The initiative was introduced with 1075 students at eight schools across four Rwandan districts. The evidence from our initiative suggests that learners in classrooms where these books were systematically used learnt more across the curriculum.

When these learners sat tests before using the books, they scored similar results to those in other comparable schools. After using the materials for four months, their test scores were significantly higher. Crucially, both learners and teachers pointed out how important it was that the books sanctioned the use of Kinyarwanda. The classrooms became bilingual spaces and this increased teachers’ and learners’ confidence and competence.

All of this supports the importance of textbooks as effective learning and teaching materials in the classroom and shows that they can help all learners. But authorities mustn’t assume that textbooks are being used or that the existing books are empowering teachers and learners.

Textbooks can matter – but it’s only when consideration is made for the ways they can help all learners that we can say that they can contribute to quality education for all.

Lizzi O. Milligan, Lecturer in International Education, University of Bath

This article was originally published on The Conversation. Read the original article.

Younger is not always better when it comes to learning a second language


Learning a language in a classroom is best for early teenagers.
from http://www.shutterstock.com

Warren Midgley, University of Southern Queensland

It’s often thought that it is better to start learning a second language at a young age. But research shows that this is not necessarily true. In fact, the best age to start learning a second language can vary significantly, depending on how the language is being learned.

The belief that younger children are better language learners is based on the observation that children learn to speak their first language with remarkable skill at a very early age.

Before they can add two small numbers or tie their own shoelaces, most children develop a fluency in their first language that is the envy of adult language learners.

Why younger may not always be better

Two theories from the 1960s continue to have a significant influence on how we explain this phenomenon.

The theory of “universal grammar” proposes that children are born with an instinctive knowledge of the language rules common to all humans. Upon exposure to a specific language, such as English or Arabic, children simply fill in the details around those rules, making the process of learning a language fast and effective.

The other theory, known as the “critical period hypothesis”, posits that at around the age of puberty most of us lose access to the mechanism that made us such effective language learners as children. These theories have been contested, but nevertheless they continue to be influential.

Despite what these theories would suggest, however, research into language learning outcomes demonstrates that younger may not always be better.

In some language learning and teaching contexts, older learners can be more successful than younger children. It all depends on how the language is being learned.

Language immersion environment best for young children

Living, learning and playing in a second language environment on a regular basis is an ideal learning context for young children. Research clearly shows that young children are able to become fluent in more than one language at the same time, provided there is sufficient engagement with rich input in each language. In this context, it is better to start as young as possible.

Learning in classroom best for early teens

Learning in language classes at school is an entirely different context. The normal pattern of these classes is one or more hour-long lessons per week.

To succeed at learning with such little exposure to rich language input requires meta-cognitive skills that do not usually develop until early adolescence.

For this style of language learning, the later years of primary school are an ideal time to start: this maximises the balance between meta-cognitive skill development and the number of consecutive years of study available before the end of school.

Self-guided learning best for adults

There are, of course, some adults who decide to start to learn a second language on their own. They may buy a study book, sign up for an online course, purchase an app or join face-to-face or virtual conversation classes.

To succeed in this learning context requires a range of skills that are not usually developed until reaching adulthood, including the ability to remain self-motivated. Therefore, self-directed second language learning is more likely to be effective for adults than younger learners.

How we can apply this to education

What does this tell us about when we should start teaching second languages to children? In terms of the development of language proficiency, the message is fairly clear.

If we are able to provide lots of exposure to rich language use, early childhood is better. If the only opportunity for second language learning is through more traditional language classes, then late primary school is likely to be just as good as early childhood.

However, if language learning relies on being self-directed, it is more likely to be successful after the learner has reached adulthood.

Warren Midgley, Associate Professor of Applied Linguistics, University of Southern Queensland

This article was originally published on The Conversation. Read the original article.

Tough immigration laws are hitting Britain’s curry houses hard


shutterstock

Emily Falconer, University of Westminster

The British curry industry is responsible for 100,000 jobs and contributes more than £4 billion to the UK economy. But it’s now feared that up to a third of all Indian restaurants could disappear because of tougher immigration laws.

The current rules require restaurants that want to employ a chef from outside the EU to pay a minimum salary of £35,000 – or £29,750 with accommodation and food – to secure a visa.

These high costs mean that many restaurants are unable to hire the skilled chefs they need, leading to a shortage of top talent – with the chefs who are available demanding higher wages. This combination of rising costs and a shortage of chefs means that many curry houses are now facing closure.

Fusion food

Britain has a long, deep relationship with what is widely known as “Indian” food. But food eaten on the Indian subcontinent is so diverse that it has as many differences as it has similarities. This means that “Indian” and “curry” are often used as umbrella terms for what is in reality a multifaceted combination of tastes and influences.

It’s been predicted that more than half of all curry houses may shut down within ten years.
Shutterstock

“Indian food” in reality is often derived from particular regions of India, Pakistan, Bangladesh and Sri Lanka as well as across Britain and Europe. And a long and complex history of colonialism and migration has made the “British Curry” a popular national dish.

As the author Panikos Panayi explains, decades of residing in Britain have inevitably changed the tastes and eating practices of many British Asian communities – whose connection with traditional foods has become increasingly tenuous.

In his book Spicing Up Britain: The Multicultural History of British Food, Panayi charts the patterns of migration and their influence on food, taste and consumption habits. He follows the tastes of British Asians who have grown up with a fusion of influences all their lives.

These are people whose diets reflect the variants of English food their parents invented to make use of the ingredients readily available to them – as opposed to just tastes from the Indian subcontinent. It meant childhood classics became spicy cheese on toast, baked beans balti with spring onion sabji, and masala burgers.

Merging of tastes

Panayi claims that the taste of South Asian food became as much a part of the childhood tastes of white British children living in certain areas of the UK as of their second and third generation Asian school friends.

In the London borough of Tower Hamlets, for example – which is home to a large Bangladeshi community – local councillors played a significant role in influencing the content of school dinners. As early as the 1980s these lunches often included Asian dishes, such as chapattis, rice and halal meat, alongside “English” staples of chips, peas and steamed sponge with custard.

Fish and chips and curry sauce – a British speciality.
Flickr/Liz Barker, CC BY-NC

These tastes shaped the palates of many British children, to the point where a combination of “English” food and “curry” became the nostalgic taste of childhood. This was commodified by major brands such as Bisto with their “curry sauce” gravy granules.

These combinations are still a main feature of many “greasy spoon” English cafes or pub menus – which feature British staples such as curry served with a choice of either rice or chips, or jacket potatoes with a spicy chicken tikka filling. Then there’s the coronation chicken sandwich – a blend of boiled chicken, curry powder, mayonnaise and sultanas – a nod to the dish created for Queen Elizabeth II’s Coronation lunch in 1953.

More recently, in a time of gastronomic obsession and “foodie” culture, the “hybridisation” of cuisines has shifted from being a matter of necessity – due to availability of ingredients – to an increasingly sophisticated, cosmopolitan and fashionable food trend.

‘One spicy crab coming right up’.
Shutterstock

The influential taste of the British curry can now be identified on modern British fine dining menus, where fillets of Scottish salmon, hand-dived scallops and Cornish crabmeat are infused with cumin, turmeric and fenugreek, while bread and butter pudding is laced with cardamom and saffron.

Multicultural Britain

But in the current political climate of migration restrictions, the free movement of people across borders looks ever more threatened – and with it, our rich cultural heritage as a multicultural country.

As diverse as the food on our plates.
Shutterstock

This will undoubtedly have a detrimental impact on imported food produce and ingredients. And it will also impact the diverse communities which have brought with them long histories of knowledge, recipes and cooking practices.

Of course, throughout history there has always been a degree of racism and resistance to “foreign” foods, but for the most part these tastes have been embraced and firmly appropriated into the British diet.

Perhaps then we can take heart during this uncertain time that merging cultures will be a British tradition that is set to continue. Because what started as the “taste of the other” is now so deeply ingrained in our food, culture and identity that it is no longer possible to disentangle national, regional or local tastes to determine what belongs where.

Emily Falconer, Lecturer in Sociology, University of Westminster

This article was originally published on The Conversation. Read the original article.

The world’s words of the year pass judgement on a dark, surreal 2016



Philip Seargeant, The Open University

Every December, lexicographers around the world choose their “words of the year”, and this year, perhaps more than ever, the stories these tell provide a fascinating insight into how we’ve experienced the drama and trauma of the last 12 months.

There was much potential in 2016. It was 500 years ago that Thomas More wrote his Utopia, and January saw the launch of a year’s celebrations under the slogan “A Year of Imagination and Possibility” – but as 2017 looms, this slogan rings hollow. Instead of utopian dreams, we’ve had a year of “post-truth” and “paranoia”, of “refugee” crises, “xenophobia” and a close shave with “fascism”.

Earlier in the year, a campaign was launched to have “Essex Girl” removed from the Oxford English Dictionary (OED). Those behind the campaign were upset at the derogatory definition – a young woman “characterised as unintelligent, promiscuous, and materialistic” – so wanted it to be expunged from the official record of the language.

The OED turned down the request, a spokeswoman explaining that since the OED is a historical dictionary, nothing is ever removed; its purpose, she said, is to describe the language as people use it, and to stand as a catalogue of the trends and preoccupations of the time.

The words of the year tradition began with the German Wort des Jahres in the 1970s. It has since spread to other languages, and become increasingly popular the world over. Those in charge of the choices are getting more innovative: in 2015, for the first time, Oxford Dictionaries chose a pictograph as their “word”: the emoji for “Face with Tears of Joy”.

In 2016, however, the verbal was very much back in fashion. The results speak volumes.

Dark days

In English, there is a range of competing words, with all the major dictionaries making their own choices. Having heralded a post-language era last year, Oxford Dictionaries decided on “post-truth” this time, defining it as the situation when “objective facts are less influential in shaping public opinion than appeals to emotion and personal belief”. In a year of evidence-light Brexit promises and Donald Trump’s persistent lies and obfuscations, this has a definite resonance. In the same dystopian vein, the Cambridge Dictionary chose “paranoid”, while Dictionary.com went for “xenophobia”.

Merriam-Webster valiantly tried to turn back the tide of pessimism. When “fascism” looked set to win its online poll, it tweeted its readers imploring them to get behind something – anything – else. The plea apparently worked, and in the end “surreal” won the day. Apt enough for a year in which events time and again almost defied belief.

The referendum that spawned a thousand words.
EPA/Andy Rain

Collins, meanwhile, chose “Brexit”, a term which its spokesperson suggested has become as flexible and influential in political discourse as “Watergate”.

Just as the latter spawned hundreds of portmanteau words whenever a political scandal broke, so Brexit begat “Bremain”, “Bremorse” and “Brexperts” – and will likely be adapted for other upcoming political rifts for many years to come. It nearly won out in Australia in fact, where “Ausexit” (severing ties with the British monarchy or the United Nations) was on the shortlist. Instead, the Australian National Dictionary went for “democracy sausage” – the tradition of eating a barbecued sausage on election day.

Around the world, a similar pattern of politics and apprehension emerges. In France, the mot de l’année was réfugiés (refugees); and in Germany postfaktisch, meaning much the same as “post-truth”. Swiss German speakers, meanwhile, went for Filterblase (filter bubble), the idea that social media is creating increasingly polarised political communities.

Switzerland’s Deaf Association, meanwhile, chose a Sign of the Year for the first time. Its choice was “Trump”, consisting of a gesture made by placing an open palm on the top of the head, mimicking the president-elect’s extravagant hairstyle.

2016’s golden boy, as far as Japan’s concerned.
Albert H. Teich

Trump’s hair also featured in Japan’s choice for this year. Rather than a word, Japan chooses a kanji (Chinese character); 2016’s choice is “金” (gold). This represented a number of different topical issues: Japan’s haul of medals at the Rio Olympics, fluctuating interest rates, the gold shirt worn by singer and YouTube sensation Piko Taro, and, inevitably, the colour of Trump’s hair.

And then there’s Austria, whose word is 51 letters long: Bundespräsidentenstichwahlwiederholungsverschiebung. It means “the repeated postponement of the runoff vote for Federal President”. Referring to the seven months of votes, legal challenges and delays over the country’s presidential election, this again references an event that flirted with extreme nationalism and exposed the convoluted nature of democracy. As a new coinage, it also illustrates language’s endless ability to creatively grapple with unfolding events.

Which brings us, finally, to “unpresidented”, a neologism Donald Trump inadvertently created when trying to spell “unprecedented” in a tweet attacking the Chinese. At the moment, it’s a word in search of a meaning, but the possibilities it suggests seem to speak perfectly to the history of the present moment. And depending on what competitors 2017 throws up, it could well emerge as a future candidate.


Philip Seargeant, Senior Lecturer in Applied Linguistics, The Open University

This article was originally published on The Conversation. Read the original article.

Things you were taught at school that are wrong



Misty Adoniou, University of Canberra

Do you remember being taught you should never start your sentences with “And” or “But”?

What if I told you that your teachers were wrong and there are lots of other so-called grammar rules that we’ve probably been getting wrong in our English classrooms for years?

How did grammar rules come about?

To understand why we’ve been getting it wrong, we need to know a little about the history of grammar teaching.

Grammar is how we organise our sentences in order to communicate meaning to others.

Those who say there is one correct way to organise a sentence are called prescriptivists. Prescriptivist grammarians prescribe how sentences must be structured.

Prescriptivists had their day in the sun in the 18th century. As books became more accessible to the everyday person, prescriptivists wrote the first grammar books to tell everyone how they must write.

These self-appointed guardians of the language just made up grammar rules for English, and put them in books that they sold. It was a way of ensuring that literacy stayed out of reach of the working classes.

They took their newly concocted rules from Latin. This was, presumably, to keep literate English out of reach of anyone who wasn’t rich or posh enough to attend a grammar school, which was a school where you were taught Latin.

And yes, that is the origin of today’s grammar schools.

The other camp of grammarians are the descriptivists. They write grammar guides that describe how English is used by different people, and for different purposes. They recognise that language isn’t static, and it isn’t one-size-fits-all.

1. You can’t start a sentence with a conjunction

Let’s start with the grammatical sin I have already committed in this article. You can’t start a sentence with a conjunction.

Obviously you can, because I did. And I expect I will do it again before the end of this article. There, I knew I would!

Those who say it is always incorrect to start a sentence with a conjunction, like “and” or “but”, sit in the prescriptivist camp.

However, according to the descriptivists, at this point in our linguistic history, it is fine to start a sentence with a conjunction in an op-ed article like this, or in a novel or a poem.

It is less acceptable to start a sentence with a conjunction in an academic journal article, or in an essay for my son’s high school economics teacher, as it turns out. But times are changing.

2. You can’t end a sentence with a preposition

Well, in Latin you can’t. In English you can, and we do all the time.

Admittedly a lot of the younger generation don’t even know what a preposition is, so this rule is already obsolete. But let’s have a look at it anyway, for old times’ sake.

According to this rule, it is wrong to say “Who did you go to the movies with?”

Instead, the prescriptivists would have me say “With whom did you go to the movies?”

I’m saving that structure for when I’m making polite chat with the Queen on my next visit to the palace.

That’s not a sarcastic comment, just a fanciful one. I’m glad I know how to structure my sentences for different audiences. It is a powerful tool. It means I usually feel comfortable in whatever social circumstances I find myself in, and I can change my writing style according to purpose and audience.

That is why we should teach grammar in schools. We need to give our children a full repertoire of language so that they can make grammatical choices that will allow them to speak and write for a wide range of audiences.

3. Put a comma when you need to take a breath

It’s a novel idea, synchronising your writing with your breathing, but the two have nothing to do with one another. If this is the instruction we give our children, it is little wonder commas are so poorly used.

Punctuation is a minefield and I don’t want to risk blowing up the internet. So here is a basic description of what commas do.

Commas provide demarcation between like grammatical structures. When adjectives, nouns, phrases or clauses are butting up against each other in a sentence, we separate them with a comma. That’s why I put commas between the three nouns and the two clauses in that last sentence.

Commas also provide demarcation for words, phrases or clauses that are embedded in a sentence for effect. The sentence would still be a sentence even if we took those words away. See, for example, the use of commas in this sentence.

4. To make your writing more descriptive, use more adjectives

American writer Mark Twain had it right.

“When you catch an adjective, kill it. No, I don’t mean utterly, but kill most of them – then the rest will be valuable.”

If you want your writing to be more descriptive, play with your sentence structure.

Consider this sentence from Liz Lofthouse’s beautiful children’s book Ziba Came on a Boat. It comes at a key turning point in the book, the story of a refugee’s escape.

“Clutching her mother’s hand, Ziba ran on and on, through the night, far away from the madness until there was only darkness and quiet.”

A beautifully descriptive sentence, and not an adjective in sight.

5. Adverbs are the words that end in ‘ly’

Lots of adverbs end in “ly”, but lots don’t.

Adverbs give more information about verbs. They tell us when, where, how and why the verb happened. So that means words like “tomorrow”, “there” and “deep” can be adverbs.

I say they can be adverbs because, actually, a word is just a word. It becomes an adverb, or a noun, or an adjective, or a verb when it is doing that job in a sentence.

Deep into the night, and the word deep is an adverb. Down a deep, dark hole and it is an adjective. When I dive into the deep, it is doing the work of a noun.

Time to take those word lists of adjectives, verbs and nouns off the classroom walls.

Time, also, to ditch those old Englishmen who wrote a grammar for their times, not ours.

If you want to understand what our language can do and how to use it well, read widely, think deeply and listen carefully. And remember, neither time nor language stands still – for any of us.


Misty Adoniou, Associate Professor in Language, Literacy and TESL, University of Canberra

This article was originally published on The Conversation. Read the original article.