The institution will be renamed CES Toronto and will undergo a “slow rebranding,” CES managing director Justin Quinn told The PIE News.
The group has acquired 100% shareholding from the previous owner and president of Global Village Toronto, Genevieve Bouchard, who has recently retired.
Former Global Village marketing director Robin Adams has been appointed president of the newly named centre, but the rest of the staff will remain the same.
“[Canada] is a very exciting market to be present in”
The move is the result of CES’ long-time interest in the Canadian market, which Quinn said he had been observing closely, waiting for the right opportunity.
“My interest in Canada has been there for a number of years, I have been watching the market very carefully and it’s a very exciting market to be present in,” he said.
“I was just waiting for the right opportunity to come along.”
CES plans to further grow and develop the school, with a view to introducing new programs, including teacher training. Quinn also hopes the school will become an Eaquals member within the next 12 months; he is the current chair of the accreditation and membership body for language schools.
“One of the things that attracted me to the school is that there is growth potential, and I certainly feel we could probably be more aggressive in our growth strategy,” Quinn explained.
The school will undertake the process to maintain its Languages Canada membership, he added.
“It’s part of the conditions – [Languages Canada] will do due diligence on us as well. It’s an impressive accreditation system,” Quinn told The PIE.
“They look at the owners, and our strategy, and our plans going forward, rather than just looking at how the school is operating at a snapshot in time.”
In a statement, CES and Global Village (both IALC accredited) said they will jointly promote their locations, which include GV Calgary, CES Dublin, CES Edinburgh, CES Harrogate, GV Hawaii, CES Leeds, CES London, CES Oxford, CES Toronto, GV Vancouver, GV Victoria, and CES Worthing.
Recent global developments have sharply polarised communities in many countries around the world. A new politics of exclusion has drawn urgent attention to the ways in which structural inequality has marginalised and silenced certain sectors of society. And yet, as a recent report shows, diversity and inclusion in fact “benefit the common good”. A more diverse group is a stronger, more creative and productive group.
In the world of literary writing, we find similar gaps and exclusions. But these are counterbalanced in some respects by new positive initiatives.
In 2015, a study revealed that literature by writers of colour had been consistently under-represented by the predominantly white British book industry. Statistics in The Bookseller show that out of thousands of books published in 2016 in the UK, fewer than 100 were by British authors of a non-white background. And out of 400 authors identified by the British public in a 2017 Royal Society of Literature survey, only 7% were black, Asian or of mixed race (compared to 13% of the population).
A similar marginalisation takes place in the curricula in schools and universities, mirroring exclusions in wider society. In most English literature courses of whatever period, the writers taught are white, largely English and largely male.
A fundamental inequality arises in which, though British culture at large is diverse, syllabuses are not. Indeed, many British readers and students find little to recognise or to identify with when they read and study mainstream British literature.
But it’s not just a case of under-representation. It’s also a case of misrepresentation.
Black and Asian writers who have been published within the mainstream British system describe the pressure they have felt to conform to cultural stereotypes in their work. Their books are often packaged and presented in ways that focus on their ethnicity, regularly using cliches. At the same time, more universal aspects of their writing are overlooked. For example, the covers of novels by Asian British writers usually stick to a limited colour palette of yellows, reds, and purples, accented by “exotic” images.
These writers bristle at the sense that they are read not as crafters of words and worlds, but as spokespeople for their communities or cultures. At its worst, this process turns these writers and their books into objects of anthropological curiosity rather than works inviting serious literary study or simply pleasurable reading. The message is that black and Asian literature is other than or outside mainstream British writing.
Against these exclusions, leading British authors such as Bernardine Evaristo and others have urged a broader, more inclusive approach. They recognise that what and how we read shapes our sense of ourselves, our communities and the world.
Reframing the narrative
The Postcolonial Writers Make Worlds research project, based in the Oxford English Faculty and The Oxford Research Centre in the Humanities, set out to ask what it means to read contemporary fiction as British readers. Working with reading groups and in discussion with writers, we found that readers of all ages entered the relatively unfamiliar worlds created by BAME authors with interest.
For many, finding points of familiarity along gender, age, geographical or other lines was important for their ability to enjoy stories from communities different from their own. Identifying in this way gave some readers new perspectives on their own contexts. At the same time, unfamiliarity was not a barrier to identification. In some cases, universal human stories, like falling in love, acted as a bridge. This suggests that how literature is presented to readers, whether it is framed as other or not, can be as significant as what is represented.
Contemporary black and Asian writing from the UK is British writing. And this means that the work of writers such as Evaristo, Nadifa Mohamed and Daljit Nagra should be placed on the same library shelf, reading list and section of the bookshop as work by Ian McEwan, Julian Barnes and Ali Smith – not exclusively in “world interest” or “global literature”.
Equally, much can be gained by thinking of white British writers like Alan Hollinghurst or Hilary Mantel as having as much of a cross-cultural or even postcolonial outlook as Aminatta Forna and Kamila Shamsie.
There are positive signs. A new EdExcel/Pearson A-level teaching resource on Contemporary Black British Literature has been developed. The Why is My Curriculum White? campaign continues to make inroads in university syllabuses. And the Jhalak Prize is raising the profile of BAME writing in Britain. Against this background, the Postcolonial Writers Make Worlds website offers a multimedia hub of resources on black and Asian British writing, providing points of departure for more inclusive, wide-ranging courses. Yet there is still much to be done.
All literature written in English in the British Isles is densely entangled with other histories, cultures, and pathways of experience both within the country and far beyond. Its syllabuses, publishing practices, and our conversations about books must reflect this.
There are many benefits to knowing more than one language. For example, it has been shown that aging adults who speak more than one language are less likely to develop dementia.
Additionally, the bilingual brain becomes better at filtering out distractions, and learning multiple languages improves creativity. Evidence also shows that learning subsequent languages is easier than learning the first foreign language.
Why is foreign language study important at the university level?
As an applied linguist, I study how learning multiple languages can have cognitive and emotional benefits. One of these benefits that’s not obvious is that language learning improves tolerance.
This happens in two important ways.
The first is that it opens people’s eyes to a way of doing things in a way that’s different from their own, which is called “cultural competence.”
The second is related to the comfort level of a person when dealing with unfamiliar situations, or “tolerance of ambiguity.”
Gaining cross-cultural understanding
Cultural competence is key to thriving in our increasingly globalized world. How specifically does language learning improve cultural competence? The answer can be illuminated by examining different types of intelligence.
Psychologist Robert Sternberg’s research on intelligence describes different types of intelligence and how they are related to adult language learning. What he refers to as “practical intelligence” is similar to social intelligence in that it helps individuals learn nonexplicit information from their environments, including meaningful gestures or other social cues.
Language learning inevitably involves learning about different cultures. Students pick up clues about the culture both in language classes and through meaningful immersion experiences.
Researchers Hanh Thi Nguyen and Guy Kellogg have shown that when students learn another language, they develop new ways of understanding culture through analyzing cultural stereotypes. They explain that “learning a second language involves the acquisition not only of linguistic forms but also ways of thinking and behaving.”
With the help of an instructor, students can critically think about stereotypes of different cultures related to food, appearance and conversation styles.
Dealing with the unknown
The second way that adult language learning increases tolerance is related to a person’s comfort level when dealing with unfamiliar situations – their “tolerance of ambiguity.”
Someone with a high tolerance of ambiguity finds unfamiliar situations exciting, rather than frightening. My research on motivation, anxiety and beliefs indicates that language learning improves people’s tolerance of ambiguity, especially when more than one foreign language is involved.
It’s not difficult to see why this may be so. Conversations in a foreign language will inevitably involve unknown words. It wouldn’t be a successful conversation if one of the speakers constantly stopped to say, “Hang on – I don’t know that word. Let me look it up in the dictionary.” Those with a high tolerance of ambiguity would feel comfortable maintaining the conversation despite the unfamiliar words involved.
Applied linguists Jean-Marc Dewaele and Li Wei also study tolerance of ambiguity and have indicated that those with experience learning more than one foreign language in an instructed setting have more tolerance of ambiguity.
Individuals with higher levels of tolerance of ambiguity have also been found to be more entrepreneurial (i.e., are more optimistic, innovative and don’t mind taking risks).
In the current climate, universities are frequently being judged by the salaries of their graduates. Taking this one step further: given the relationship between tolerance of ambiguity and entrepreneurial intention, increased tolerance of ambiguity could lead to higher salaries for graduates, which in turn, I believe, could help increase funding for those universities that require foreign language study.
Those who have devoted their lives to theorizing about and teaching languages would say, “It’s not about the money.” But perhaps it is.
Language learning in higher ed
Most American universities have a minimal language requirement that often varies depending on the student’s major. However, students can typically opt out of the requirement by taking a placement test or providing some other proof of competency.
In contrast to this trend, Princeton recently announced that all students, regardless of their competency when entering the university, would be required to study an additional language.
I’d argue that more universities should follow Princeton’s lead, as language study at the university level could lead to an increased tolerance of the different cultural norms represented in American society, which is desperately needed in the current political climate with the wave of hate crimes sweeping university campuses nationwide.
Knowledge of different languages is crucial to becoming global citizens. As former Secretary of Education Arne Duncan noted,
“Our country needs to create a future in which all Americans understand that by speaking more than one language, they are enabling our country to compete successfully and work collaboratively with partners across the globe.”
Considering the evidence that studying languages as adults increases tolerance in two important ways, the question shouldn’t be “Why should universities require foreign language study?” but rather “Why in the world wouldn’t they?”
Language is pervasive throughout the criminal justice system. A textual chain follows a person from the moment they are arrested until their day in court, and it is all underpinned by meticulously drafted legislation. At every step, there are challenges faced by laypeople who find themselves in the linguistic webs of the justice system.
Anyone who reads a UK act of parliament, for example, is met with myriad linguistic complexities. Archaic formulae, complex prepositions, and lengthy, embedded clauses abound in the pages of the law. Such language can render legal texts inaccessible to the everyday reader. Some argue (see Vijay Bhatia’s chapter) that this is a deliberate ploy by the legal establishment to keep the non-expert at arm’s length.
But closer to the truth is the fact that legal language, like all language in all contexts, is the way it is because of its function and purpose. Those drafting laws must ensure enough precision and unambiguity so that the law can be applied, while also being flexible and inclusive enough to account for the unpredictability of human behaviour.
The cost of this linguistic balancing act, however, is increased complexity and the exclusion of the uninitiated. Legal language has long been in the crosshairs of The Plain English Campaign which argues for its simplification, claiming that “if we can’t understand our rights, we have no rights”.
It is not only written legal language that presents difficulties for the layperson. Once someone is arrested they go through a chain of communicative events, each one coloured by institutional language, and each one with implications for the next. It begins with the arresting officer reading the suspect their rights. In England and Wales, the police caution reads:
You do not have to say anything. But, it may harm your defence if you do not mention when questioned something which you later rely on in court. Anything you do say may be given in evidence.
This may seem very familiar to many readers (perhaps due to their penchant for police dramas), but this short set of statements is linguistically complex. The strength of the verb “may”; what exactly constitutes “mentioning” or “relying”, and what “questioning” is and when it will take place, are just some of the ambiguities that may be overlooked at first glance.
What the research says
Indeed, research has found that, although people claim to fully comprehend the caution, they are often incapable of demonstrating any understanding of it at all. Frances Rock has also written extensively on the language of cautioning and found that when police officers explain the caution to detainees in custody, there is substantial variation in the explanations offered. Some explanations add clarity, while others introduce even more puzzles.
The difficulties in understanding legal language are typically overcome by the hiring of legal representation. Peter Tiersma, in his seminal 1999 book Legal Language, noted that “the hope that every man can be his own lawyer, which has existed for centuries, is probably no more realistic than having people be their own doctor”.
However, in the UK at least, cuts in legal aid mean that more people are representing themselves, removing the protection of a legal-language expert. Work by Tatiana Tkacukova has revealed the communicative struggles of these so-called “litigants in person” as they step into the courtroom arena of seasoned legal professionals.
Trained lawyers have developed finely-tuned cross-examination techniques, and all witnesses who take the stand, including the alleged victim or plaintiff, are likely to be subjected to gruelling cross-examination, characterised by coercive and controlling questioning. At best, witnesses might emerge from the courtroom feeling frustrated, and at worst victims may leave feeling victimised once again.
The work of forensic linguists has led to progress in some areas. For instance, it is long established that the cross-examination of alleged rape victims is often underpinned by societal preconceptions and prejudices which, when combined with rigorous questioning, are found to traumatise victims further. Recent reforms in England and Wales provide rape victims with the option to avoid “live” courtroom cross-examination and may go some way towards addressing this issue.
Further afield, an international group of linguists, psychologists, lawyers and interpreters have produced a set of guidelines for communicating rights to non-native speakers of English in Australia, England and Wales, and the US. These guidelines include recommendations for the wording and communication of cautions and rights to detainees, which aim to protect those already vulnerable from further problems of misunderstanding in the justice system.
Language will forever remain integral to our criminal justice system, and it will continue to disadvantage many who find themselves in the process. However, as the pool and remit of forensic linguists grows, there are greater opportunities to rebalance the linguistic inequalities of the legal system in favour of the layperson.
Citizenship applicants will need to demonstrate a higher level of English proficiency if the government’s proposed changes to the Australian citizenship test go ahead.
Applicants will be required to reach the equivalent of Band 6 proficiency of the International English Language Testing System (IELTS).
To achieve Band 6, applicants must correctly answer 30 out of 40 questions in the reading paper and 23 out of 40 in the listening paper, while the writing paper rewards language used “accurately and appropriately”. If a candidate’s writing has “frequent” inaccuracies in grammar and spelling, they cannot achieve Band 6.
Success in IELTS requires both proficiency in the English language and an understanding of how to take – and pass – a test. The proposed changes would therefore make it harder for people with fragmented educational backgrounds, such as many refugees, to become citizens.
How do the tests currently work?
The current citizenship test consists of 20 multiple-choice questions in English concerning Australia’s political system, history, and citizen responsibilities.
While the test does not require demonstration of English proficiency per se, it acts as an indirect assessment of language.
For example, the question: “Which official symbol of Australia identifies Commonwealth property?” demonstrates the level of linguistic complexity required.
The IELTS test is commonly taken for immigration purposes as a requirement for certain visa categories; however, its designers argue that the test was never intended for this purpose. Researchers have argued that the growing strength of English as the language of politics and economics has resulted in its widespread use for immigration purposes.
For many adult refugees – who have minimal first language literacy, fragmented educational experiences, and limited opportunities to gain feedback on their written English – the required level of “competency” may be a prohibitive barrier to citizenship. This is also more likely to impact refugee women, who are less likely to have had formal schooling and more likely to assume caring duties.
There are a number of questions to clarify regarding the proposed language proficiency test:
Will those dealing with trauma-related experiences gain exemption from a high-stakes, time-pressured examination?
What support mechanisms will be provided to assist applicants to study for the test?
Will financially-disadvantaged members of the community be expected to pay for classes/ materials in order to prepare for the citizenship test?
The IELTS test costs A$330, with no subsidies available. Will the IELTS-based citizenship/ language test attract similar fees?
There are also questions about the fairness of requiring applicants to demonstrate a specific type and level of English under examination conditions that is not required of all citizens. Those born in Australia are not required to pass an academic test of language in order to retain their citizenship.
Recognising diversity of experiences
There are a few things the government should consider before introducing a language test:
1) Community consultation is essential. Input from community/ migrant groups, educators, and language assessment specialists will ensure the test functions as a valid evaluation of progression towards English language proficiency. The government is currently calling for submissions related to the new citizenship test.
2) Design the test to value different forms and varieties of English that demonstrate progression in learning rather than adherence to prescriptive standards.
3) Provide educational opportunities that build on existing linguistic strengths and help people prepare for the test.
Equating a particular type of language proficiency with a commitment to Australian citizenship is a complex and ideologically-loaded notion. The government must engage in careful consideration before potentially further disadvantaging those most in need of citizenship.
We’re all aware that there are stereotypes. The British are sharply sarcastic, the Americans are great at physical comedy, and the Japanese love puns. But is humour actually driven by culture to any meaningful extent? Couldn’t it be more universal – or depend largely on the individual?
There are some good reasons to believe that there is such a thing as a national sense of humour. But let’s start with what we actually have in common, by looking at the kinds of humour that most easily transcend borders.
Certain kinds of humour are more commonly used in circumstances that are international and multicultural in nature – such as airports. When it comes to onboard entertainment, airlines, in particular, are fond of humour that transcends cultural and linguistic boundaries for obvious reasons. Slapstick humour and the bland but almost universally tolerable social transgressions and faux pas of Mr Bean permit a safe, gentle humour that we can all relate to. Similarly, the silent situational dilemmas of the Canadian Just for Laughs hidden camera reality television show have been a staple option for airlines for many years.
These have a broad reach and are probably unlikely to offend most people. Of course, an important component in their broad appeal is that they are not really based on language.
Language and culture
Most humour, and certainly humour that involves greater cognitive effort, is deeply embedded in language and culture. It relies on a shared language or set of culturally based constructs to function. Puns and idioms are obvious examples.
Indeed, most modern theories of humour suggest that some form of shared knowledge is one of the key foundations of humour – that is, after all, what a culture is.
Some research has demonstrated this. One study measured humour in Singaporean college students and compared it with that of North American and Israeli students. This was done using a questionnaire asking participants to describe jokes they found funny, among other things. The researchers found that the Americans were more likely to tell sex jokes than the Singaporeans. The Singaporean jokes, on the other hand, were slightly more often focused on violence. The researchers interpreted the lack of sex jokes among Singaporean students to be a reflection of a more conservative society. Aggressive jokes may be explained by a cultural emphasis on strength for survival.
Another study compared Japanese and Taiwanese students’ appreciation of English jokes. It found that the Taiwanese generally enjoyed jokes more than the Japanese and were also more eager to understand incomprehensible jokes. The authors argued that this could be down to a more hierarchical culture in Japan, leaving less room for humour.
Denigration and self-deprecation
There are many overarching themes that can be used to define a nation’s humour. A nation that laughs together is one with strong bonds between its citizens. Laughter is one of our main social signals and, combined with humour, it can emphasise social bonding – albeit sometimes at the cost of denigrating other groups. This can be seen across many countries. For example, the French tend to enjoy a joke about the Belgians while Swedes make fun of Norwegians. Indeed, most nations have a preferred country that serves as a traditional butt of their jokes.
Sexist and racist humour are also examples of this sort of denigration. The types of jokes used can vary across cultures, but the phenomenon itself can boost social bonding. Knowledge of acceptable social boundaries is therefore crucial and reinforces social cohesion. Because denigration is usually not the principal aim of the interaction, people often fail to realise that they are being offensive when they were “only joking”. However, as the world becomes more global and tolerant of difference, this type of humour is much less acceptable in cultures that welcome diversity.
Self-denigration or self-deprecation is also important – if it is relatively mild and remains within acceptable social norms. Benign violation theory argues that something that threatens social or cultural norms can also result in humour.
Importantly, what constitutes a benign level of harm is strongly culturally bound and differs from nation to nation, between social groups within nations and over the course of a nation’s history. What was once tolerable as national humour can now seem very unacceptable. For the British, it may be acceptable to make fun of Britons being overly polite, orderly or reluctant to talk to strangers. However, jokes about the nature of Britain’s colonial past would be much more contentious – they would probably violate social norms without being emotionally benign.
Another factor is our need to demonstrate that we understand the person we are joking with. My own ideas suggest we even have a desire to display skills of knowing what another person thinks – mind-reading in the scientific sense. For this, cultural alignment and an ability to display it are key elements in humour production and appreciation – it can make us joke differently with people from our own country than with people from other cultures.
For example, most people in the UK know that the popular phrase “don’t mention the war” refers to a Fawlty Towers sketch. Knowing that “fork handles” is funny also marks you as a UK citizen. Similarly, knowledge of “I Love Lucy” or quotes from Seinfeld creates affiliation among many in the US, while references to “Chavo del Ocho” or “Chapulín Colorado” do the same for Mexicans and most Latin Americans.
These shared cultural motifs – here drawn mostly from television – are one important aspect of a national sense of humour. They create a sense of belonging and camaraderie. They make us feel more confident about our humour and can be used to build further jokes on.
A broadly shared sense of humour is probably one of our best indicators for how assimilated we are as a nation. Indeed, a nation’s humour is more likely to show unity within a country than to display a nation as being different from other nations in any meaningful way.
Textbooks are a crucial part of any child’s learning. A large body of research has proved this many times and in many very different contexts. Textbooks are a physical representation of the curriculum in a classroom setting. They are powerful in shaping the minds of children and young people.
UNESCO has recognised this power and called for every child to have a textbook for every subject. The organisation argues that
next to an engaged and prepared teacher, well-designed textbooks in sufficient quantities are the most effective way to improve instruction and learning.
But there’s an elephant in the room when it comes to textbooks in African countries’ classrooms: language.
Rwanda is one of many African countries that’s adopted a language instruction policy which sees children learning in local or mother tongue languages for the first three years of primary school. They then transition in upper primary and secondary school into a dominant, so-called “international” language. This might be French or Portuguese. In Rwanda, it has been English since 2008.
Evidence from across the continent suggests that at this transition point, many learners have not developed basic literacy and numeracy skills. And, significantly, they have not acquired anywhere near enough of the language they are about to learn in to be able to engage in learning effectively.
I do not wish to advocate for English medium instruction, and the arguments for mother-tongue based education are compelling. But it’s important to consider strategies for supporting learners within existing policy priorities. Using appropriate learning and teaching materials – such as textbooks – could be one such strategy.
A different approach
It’s not enough to just hand out textbooks in every classroom. The books need to tick two boxes: learners must be able to read them and teachers must feel enabled to teach with them.
Existing textbooks tend not to take these concerns into consideration. The language is too difficult and the sentence structures too complex. The paragraphs are too long and there are no glossaries to define unfamiliar words. And while textbooks are widely available to those in the basic education system, they are rarely used systematically. Teachers cite the books’ inaccessibility as one of the main reasons for not using them.
A recent initiative in Rwanda has sought to address this through the development of “language supportive” textbooks for primary 4 learners who are around 11 years old. These were specifically designed in collaboration with local publishers, editors and writers.
There are two key elements to a “language supportive” textbook.
Firstly, they are written at a language level which is appropriate for the learner. As can be seen in Figure 1, the new concept is introduced in as simple English as possible. The sentence structure and paragraph length are also shortened and made as simple as possible. The key word (here, “soil”) is also repeated numerous times so that the learner becomes accustomed to this word.
Secondly, they include features – activities, visuals, clear signposting and vocabulary support – that enable learners to practice and develop their language proficiency while learning the key elements of the curriculum.
The books are full of relevant activities that encourage learners to regularly practice their listening, speaking, reading and writing of English in every lesson. This enables language development.
Crucially, all of these activities are made accessible to learners – and teachers – by offering support in the learners’ first language. In this case, the language used was Kinyarwanda, which is the first language for the vast majority of Rwandan people. However, it’s important to note that initially many teachers were hesitant about incorporating Kinyarwanda into their classroom practice because of the government’s English-only policy.
Improved test scores
The initiative was introduced with 1,075 students at eight schools across four Rwandan districts. The evidence from our initiative suggests that learners in classrooms where these books were systematically used learnt more across the curriculum.
When these learners sat tests before using the books, they scored similar results to those in other comparable schools. After using the materials for four months, their test scores were significantly higher. Crucially, both learners and teachers pointed out how important it was that the books sanctioned the use of Kinyarwanda. The classrooms became bilingual spaces and this increased teachers’ and learners’ confidence and competence.
All of this supports the importance of textbooks as effective learning and teaching materials in the classroom and shows that they can help all learners. But authorities mustn’t assume that textbooks are being used or that the existing books are empowering teachers and learners.
Textbooks can matter – but only when they are designed with all learners in mind can they contribute to quality education for all.
It’s often thought that it is better to start learning a second language at a young age. But research shows that this is not necessarily true. In fact, the best age to start learning a second language can vary significantly, depending on how the language is being learned.
The belief that younger children are better language learners is based on the observation that children learn to speak their first language with remarkable skill at a very early age.
Before they can add two small numbers or tie their own shoelaces, most children develop a fluency in their first language that is the envy of adult language learners.
Why younger may not always be better
Two theories from the 1960s continue to have a significant influence on how we explain this phenomenon.
The theory of “universal grammar” proposes that children are born with an instinctive knowledge of the language rules common to all humans. Upon exposure to a specific language, such as English or Arabic, children simply fill in the details around those rules, making the process of learning a language fast and effective.
The other theory, known as the “critical period hypothesis”, posits that at around the age of puberty most of us lose access to the mechanism that made us such effective language learners as children. These theories have been contested, but nevertheless they continue to be influential.
Despite what these theories would suggest, however, research into language learning outcomes demonstrates that younger may not always be better.
In some language learning and teaching contexts, older learners can be more successful than younger children. It all depends on how the language is being learned.
Language immersion environment best for young children
Living, learning and playing in a second language environment on a regular basis is an ideal learning context for young children. Research clearly shows that young children are able to become fluent in more than one language at the same time, provided there is sufficient engagement with rich input in each language. In this context, it is better to start as young as possible.
Learning in classroom best for early teens
Learning in language classes at school is an entirely different context. The normal pattern of these classes is to have one or more hourly lessons per week.
To succeed at learning with such little exposure to rich language input requires meta-cognitive skills that do not usually develop until early adolescence.
For this style of language learning, the later years of primary school are an ideal time to start, maximising the balance between meta-cognitive skill development and the number of consecutive years of study available before the end of school.
Self-guided learning best for adults
There are, of course, some adults who decide to start to learn a second language on their own. They may buy a study book, sign up for an online course, purchase an app or join face-to-face or virtual conversation classes.
To succeed in this learning context requires a range of skills that are not usually developed until reaching adulthood, including the ability to remain self-motivated. Therefore, self-directed second language learning is more likely to be effective for adults than younger learners.
How we can apply this to education
What does this tell us about when we should start teaching second languages to children? In terms of the development of language proficiency, the message is fairly clear.
If we are able to provide lots of exposure to rich language use, early childhood is better. If the only opportunity for second language learning is through more traditional language classes, then late primary school is likely to be just as good as early childhood.
However, if language learning relies on being self-directed, it is more likely to be successful after the learner has reached adulthood.
The current rules require restaurants that want to employ a chef from outside the EU to pay a minimum salary of £35,000 – or £29,750 with accommodation and food – to secure a visa.
These high costs mean that many restaurants cannot hire the skilled chefs they need – which has led to a shortage of top talent, with those chefs who are available demanding higher wages. This combination of rising costs and a shortage of chefs means that many curry houses now face closure.
Britain has a long, deep relationship with what is widely known as “Indian” food. But the food eaten on the Indian subcontinent is so diverse that it has as many differences as similarities, meaning that “Indian” and “curry” are often used as umbrella terms for what is in reality a multifaceted combination of tastes and influences.
In reality, “Indian food” is often derived from particular regions of India, Pakistan, Bangladesh and Sri Lanka, as well as from across Britain and Europe. A long and complex history of colonialism and migration has made the “British curry” a popular national dish.
As the author Panikos Panayi explains, decades of residing in Britain have inevitably changed the tastes and eating practices of many British Asian communities – whose connection with traditional foods has become increasingly tenuous.
These are people whose diets reflect the variants of English food their parents invented to make use of the ingredients readily available to them – as opposed to just tastes from the Indian subcontinent. It meant childhood classics became spicy cheese on toast or baked beans balti with spring onion sabji and masala burgers.
Merging of tastes
Panayi claims that the taste of South Asian food became as much a part of the childhoods of white British children living in certain areas of the UK as of their second- and third-generation Asian school friends.
In the London borough of Tower Hamlets, for example – which is home to a large Bangladeshi community – local councillors played a significant role in influencing the content of school dinners. As early as the 1980s, these lunches often included Asian vegetarian dishes such as chapattis and rice, as well as halal meat, alongside “English” staples of chips, peas and steamed sponge with custard.
These tastes shaped the palates of many British children, to the point where a combination of “English” food and “curry” became the nostalgic taste of childhood. This was commodified by major brands such as Bisto with their “curry sauce” gravy granules.
These combinations are still a main feature of many “greasy spoon” English cafes or pub menus – which feature British staples such as curry served with a choice of either rice or chips, or jacket potatoes with a spicy chicken tikka filling. Then there’s the coronation chicken sandwich – a blend of boiled chicken, curry powder, mayonnaise and sultanas – a nod to the dish created for Queen Elizabeth II’s coronation lunch in 1953.
More recently, in a time of gastronomic obsession and “foodie” culture, the “hybridisation” of cuisines has shifted from being a matter of necessity – due to availability of ingredients – to an increasingly sophisticated, cosmopolitan and fashionable food trend.
The influential taste of the British curry can now be identified on modern British fine dining menus, where fillets of Scottish salmon, hand-dived scallops and Cornish crabmeat are infused with cumin, turmeric and fenugreek, while bread and butter pudding is laced with cardamom and saffron.
But in the current political climate of migration restrictions, the free movement of people across borders looks ever more threatened – and with it our rich cultural heritage as a multicultural country is also under threat.
This will undoubtedly have a detrimental impact on imported food produce and ingredients. And it will also impact the diverse communities which have brought with them long histories of knowledge, recipes and cooking practices.
Of course, throughout history there has always been a degree of racism and resistance to “foreign” foods, but for the most part these tastes have become embraced and firmly appropriated into the British diet.
Perhaps then we can take heart during this uncertain time that merging cultures will be a British tradition that is set to continue. Because what started as the “taste of the other” is now so deeply ingrained in our food, culture and identity that it is no longer possible to disentangle national, regional or local tastes to determine what belongs where.
Every December, lexicographers around the world choose their “words of the year”, and this year, perhaps more than ever, the stories these tell provide a fascinating insight into how we’ve experienced the drama and trauma of the last 12 months.
There was much potential in 2016. It was 500 years ago that Thomas More wrote his Utopia, and January saw the launch of a year’s celebrations under the slogan “A Year of Imagination and Possibility” – but as 2017 looms, this slogan rings hollow. Instead of utopian dreams, we’ve had a year of “post-truth” and “paranoia”, of “refugee” crises, “xenophobia” and a close shave with “fascism”.
Earlier in the year, a campaign was launched to have “Essex Girl” removed from the Oxford English Dictionary (OED). Those behind the campaign were upset at the derogatory definition – a young woman “characterised as unintelligent, promiscuous, and materialistic” – so wanted it to be expunged from the official record of the language.
The OED turned down the request, a spokeswoman explaining that since the OED is a historical dictionary, nothing is ever removed; its purpose, she said, is to describe the language as people use it, and to stand as a catalogue of the trends and preoccupations of the time.
The words of the year tradition began with the German Wort des Jahres in the 1970s. It has since spread to other languages, and become increasingly popular the world over. Those in charge of the choices are getting more innovative: in 2015, for the first time, Oxford Dictionaries chose a pictograph as their “word”: the emoji for “Face with Tears of Joy”.
In 2016, however, the verbal was very much back in fashion. The results speak volumes.
In English, there is a range of competing words, with all the major dictionaries making their own choices. Having heralded a post-language era with last year’s emoji choice, Oxford Dictionaries decided on “post-truth” this time, defining it as the situation when “objective facts are less influential in shaping public opinion than appeals to emotion and personal belief”. In a year of evidence-light Brexit promises and Donald Trump’s persistent lies and obfuscations, this has a definite resonance. In the same dystopian vein, the Cambridge Dictionary chose “paranoid”, while Dictionary.com went for “xenophobia”.
Merriam-Webster valiantly tried to turn back the tide of pessimism. When “fascism” looked set to win its online poll, it tweeted its readers imploring them to get behind something – anything – else. The plea apparently worked, and in the end “surreal” won the day. Apt enough for a year in which events time and again almost defied belief.
Collins, meanwhile, chose “Brexit”, a term which its spokesperson suggested has become as flexible and influential in political discourse as “Watergate”.
Just as the latter spawned hundreds of portmanteau words whenever a political scandal broke, so Brexit begat “Bremain”, “Bremorse” and “Brexperts” – and will likely be adapted for other upcoming political rifts for many years to come. In fact, it nearly won out in Australia, where “Ausexit” (severing ties with the British monarchy or the United Nations) was on the shortlist. Instead, the Australian National Dictionary went for “democracy sausage” – the tradition of eating a barbecued sausage on election day.
Switzerland’s Deaf Association, meanwhile, chose a Sign of the Year for the first time. Its choice was “Trump”, consisting of a gesture made by placing an open palm on the top of the head, mimicking the president-elect’s extravagant hairstyle.
Trump’s hair also featured in Japan’s choice for this year. Rather than a word, Japan chooses a kanji (Chinese character); 2016’s choice is “金” (gold). This represented a number of different topical issues: Japan’s haul of medals at the Rio Olympics, fluctuating interest rates, the gold shirt worn by singer and YouTube sensation Piko Taro, and, inevitably, the colour of Trump’s hair.
And then there’s Austria, whose word is 51 letters long: “Bundespräsidentenstichwahlwiederholungsverschiebung”. It means “the repeated postponement of the runoff vote for Federal President”. Referring to the seven months of votes, legal challenges and delays over the country’s presidential election, this again references an event that flirted with extreme nationalism and exposed the convoluted nature of democracy. As a new coinage, it also illustrates language’s endless ability to creatively grapple with unfolding events.
Which brings us, finally, to “unpresidented”, a neologism Donald Trump inadvertently created when trying to spell “unprecedented” in a tweet attacking the Chinese. At the moment, it’s a word in search of a meaning, but the possibilities it suggests seem to speak perfectly to the history of the present moment. And depending on what competitors 2017 throws up, it could well emerge as a future candidate.
by Gianfranco Conti, PhD. Co-author of ‘The Language Teacher Toolkit’ and ‘Breaking the Sound Barrier: Teaching Learners How to Listen’, winner of the 2015 TES best resource contributor award and founder of www.language-gym.com