British literature is richly tangled with other histories and cultures – so why is it sold as largely white and English?


Brick Lane: popularised in a novel by the British writer Monica Ali.
Shutterstock

Elleke Boehmer, University of Oxford and Erica Lombard, University of Oxford

Recent global developments have sharply polarised communities in many countries around the world. A new politics of exclusion has drawn urgent attention to the ways in which structural inequality has marginalised and silenced certain sectors of society. And yet, as a recent report shows, diversity and inclusion in fact “benefit the common good”. A more diverse group is a stronger, more creative and productive group.

In the world of literary writing, we find similar gaps and exclusions. But these are counterbalanced in some respects by new positive initiatives.

In 2015, a study revealed that literature by writers of colour had been consistently under-represented by the predominantly white British book industry. Statistics in The Bookseller show that out of thousands of books published in 2016 in the UK, fewer than 100 were by British authors of a non-white background. And out of 400 authors identified by the British public in a 2017 Royal Society of Literature survey, only 7% were black, Asian or of mixed race (compared to 13% of the population).

Colourful misrepresentation

A similar marginalisation takes place in the curricula in schools and universities, mirroring exclusions in wider society. In most English literature courses of whatever period, the writers taught are white, largely English and largely male.

A fundamental inequality arises in which, though British culture at large is diverse, syllabuses are not. Indeed, many British readers and students find little to recognise or to identify with when they read and study mainstream British literature.

But it’s not just a case of under-representation. It’s also a case of misrepresentation.

Black and Asian writers who have been published within the mainstream British system describe the pressure they have felt to conform to cultural stereotypes in their work. Their books are often packaged and presented in ways that focus on their ethnicity, regularly using cliches. At the same time, more universal aspects of their writing are overlooked. For example, the covers of novels by Asian British writers usually stick to a limited colour palette of yellows, reds, and purples, accented by “exotic” images.

 

These writers bristle at the sense that they are read not as crafters of words and worlds, but as spokespeople for their communities or cultures. At its worst, this process turns these writers and their books into objects of anthropological curiosity rather than works inviting serious literary study or simply pleasurable reading. The message is that black and Asian literature is other than or outside mainstream British writing.

Against these exclusions, leading British authors such as Bernardine Evaristo have urged a broader, more inclusive approach. They recognise that what and how we read shapes our sense of ourselves, our communities and the world.

Reframing the narrative

The Postcolonial Writers Make Worlds research project, based in the Oxford English Faculty and The Oxford Research Centre in the Humanities, set out to ask what it means to read contemporary fiction as British readers. Working with reading groups and in discussion with writers, we found that readers of all ages entered the relatively unfamiliar worlds created by BAME authors with interest.

For many, finding points of familiarity along gender, age, geographical or other lines was important for their ability to enjoy stories from communities different from their own. Identifying in this way gave some readers new perspectives on their own contexts. At the same time, unfamiliarity was not a barrier to identification. In some cases, universal human stories, like falling in love, acted as a bridge. This suggests that how literature is presented to readers, whether it is framed as other or not, can be as significant as what is represented.

Contemporary black and Asian writing from the UK is British writing. And this means that the work of writers such as Evaristo, Nadifa Mohamed and Daljit Nagra should be placed on the same library shelf, reading list and section of the bookshop as work by Ian McEwan, Julian Barnes and Ali Smith – not shelved exclusively under “world interest” or “global literature”.

Bookish.
Shutterstock

Equally, much can be gained by thinking of white British writers like Alan Hollinghurst or Hilary Mantel as having as much of a cross-cultural or even postcolonial outlook as Aminatta Forna and Kamila Shamsie.

There are positive signs. A new Edexcel/Pearson A-level teaching resource on Contemporary Black British Literature has been developed. The Why is My Curriculum White? campaign continues to make inroads in university syllabuses. And the Jhalak Prize is raising the profile of BAME writing in Britain. Against this background, the Postcolonial Writers Make Worlds website offers a multimedia hub of resources on black and Asian British writing, providing points of departure for more inclusive, wide-ranging courses. Yet there is still much to be done.

All literature written in English in the British Isles is densely entangled with other histories, cultures, and pathways of experience, both within the country and far beyond. Its syllabuses, publishing practices, and our conversations about books must reflect this.

Elleke Boehmer, Professor of World Literature in English, University of Oxford and Erica Lombard, Postdoctoral Research Fellow, University of Oxford

This article was originally published on The Conversation. Read the original article.

 

How learning a new language improves tolerance


Why learn a new language?
Timothy Vollmer, CC BY

Amy Thompson, University of South Florida

There are many benefits to knowing more than one language. For example, it has been shown that aging adults who speak more than one language are less likely to develop dementia.

Additionally, the bilingual brain becomes better at filtering out distractions, and learning multiple languages improves creativity. Evidence also shows that learning subsequent languages is easier than learning the first foreign language.

Unfortunately, not all American universities consider learning foreign languages a worthwhile investment.

Why is foreign language study important at the university level?

As an applied linguist, I study how learning multiple languages can have cognitive and emotional benefits. One of these benefits that’s not obvious is that language learning improves tolerance.

This happens in two important ways.

The first is that it opens people’s eyes to ways of doing things that are different from their own, which is called “cultural competence.”

The second is related to the comfort level of a person when dealing with unfamiliar situations, or “tolerance of ambiguity.”

Gaining cross-cultural understanding

Cultural competence is key to thriving in our increasingly globalized world. How specifically does language learning improve cultural competence? The answer can be illuminated by examining different types of intelligence.

Psychologist Robert Sternberg’s research on intelligence describes different types of intelligence and how they are related to adult language learning. What he refers to as “practical intelligence” is similar to social intelligence in that it helps individuals learn nonexplicit information from their environments, including meaningful gestures or other social cues.

Learning a foreign language reduces social anxiety.
COD Newsroom, CC BY

Language learning inevitably involves learning about different cultures. Students pick up clues about the culture both in language classes and through meaningful immersion experiences.

Researchers Hanh Thi Nguyen and Guy Kellogg have shown that when students learn another language, they develop new ways of understanding culture through analyzing cultural stereotypes. They explain that “learning a second language involves the acquisition not only of linguistic forms but also ways of thinking and behaving.”

With the help of an instructor, students can critically think about stereotypes of different cultures related to food, appearance and conversation styles.

Dealing with the unknown

The second way that adult language learning increases tolerance relates to a person’s comfort level when dealing with unfamiliar situations – their “tolerance of ambiguity.”

Someone with a high tolerance of ambiguity finds unfamiliar situations exciting, rather than frightening. My research on motivation, anxiety and beliefs indicates that language learning improves people’s tolerance of ambiguity, especially when more than one foreign language is involved.

It’s not difficult to see why this may be so. Conversations in a foreign language will inevitably involve unknown words. It wouldn’t be a successful conversation if one of the speakers constantly stopped to say, “Hang on – I don’t know that word. Let me look it up in the dictionary.” Those with a high tolerance of ambiguity would feel comfortable maintaining the conversation despite the unfamiliar words involved.

Applied linguists Jean-Marc Dewaele and Li Wei also study tolerance of ambiguity and have indicated that those with experience learning more than one foreign language in an instructed setting have more tolerance of ambiguity.

What changes with this understanding

A high tolerance of ambiguity brings many advantages. It helps students become less anxious in social interactions and in subsequent language learning experiences. Not surprisingly, the more experience a person has with language learning, the more comfortable the person gets with this ambiguity.

And that’s not all.

Individuals with higher levels of tolerance of ambiguity have also been found to be more entrepreneurial (i.e., are more optimistic, innovative and don’t mind taking risks).

In the current climate, universities are frequently judged by the salaries of their graduates. Taking it one step further: given the relationship between tolerance of ambiguity and entrepreneurial intention, increased tolerance of ambiguity could lead to higher salaries for graduates, which in turn, I believe, could help increase funding for those universities that require foreign language study.

Those who have devoted their lives to theorizing about and teaching languages would say, “It’s not about the money.” But perhaps it is.

Language learning in higher ed

Most American universities have a minimal language requirement that often varies depending on the student’s major. However, students can typically opt out of the requirement by taking a placement test or providing some other proof of competency.

Why more universities should teach a foreign language.
sarspri, CC BY-NC

In contrast to this trend, Princeton recently announced that all students, regardless of their competency when entering the university, would be required to study an additional language.

I’d argue that more universities should follow Princeton’s lead. Language study at the university level could lead to an increased tolerance of the different cultural norms represented in American society – something desperately needed in the current political climate, with a wave of hate crimes sweeping university campuses nationwide.

Knowledge of different languages is crucial to becoming global citizens. As former Secretary of Education Arne Duncan noted,

“Our country needs to create a future in which all Americans understand that by speaking more than one language, they are enabling our country to compete successfully and work collaboratively with partners across the globe.”

Considering the evidence that studying languages as adults increases tolerance in two important ways, the question shouldn’t be “Why should universities require foreign language study?” but rather “Why in the world wouldn’t they?”

Amy Thompson, Associate Professor of Applied Linguistics, University of South Florida

This article was originally published on The Conversation. Read the original article.

 

Language puts ordinary people at a disadvantage in the criminal justice system


‘Now, did you understand all that?’
Shutterstock

David Wright, Nottingham Trent University

Language is pervasive throughout the criminal justice system. A textual chain follows a person from the moment they are arrested until their day in court, and it is all underpinned by meticulously drafted legislation. At every step, there are challenges faced by laypeople who find themselves in the linguistic webs of the justice system.

Anyone who reads a UK act of parliament, for example, is met with myriad linguistic complexities. Archaic formulae, complex prepositions, and lengthy embedded clauses abound in the pages of the law. Such language can render legal texts inaccessible to the everyday reader. Some argue (see Vijay Bhatia’s chapter) that this is a deliberate ploy by the legal establishment to keep the non-expert at arm’s length.

But closer to the truth is the fact that legal language, like all language in all contexts, is the way it is because of its function and purpose. Those drafting laws must make the law precise and unambiguous enough to be applied, while keeping it flexible and inclusive enough to account for the unpredictability of human behaviour.

The cost of this linguistic balancing act, however, is increased complexity and the exclusion of the uninitiated. Legal language has long been in the crosshairs of the Plain English Campaign, which argues for its simplification, claiming that “if we can’t understand our rights, we have no rights”.

It is not only written legal language that presents difficulties for the layperson. Once someone is arrested they go through a chain of communicative events, each one coloured by institutional language, and each one with implications for the next. It begins with the arresting officer reading the suspect their rights. In England and Wales, the police caution reads:

You do not have to say anything. But it may harm your defence if you do not mention when questioned something which you later rely on in court. Anything you do say may be given in evidence.

This may seem very familiar to many readers (perhaps due to their penchant for police dramas), but this short set of statements is linguistically complex. The strength of the verb “may”; what exactly constitutes “mentioning” or “relying”, and what “questioning” is and when it will take place, are just some of the ambiguities that may be overlooked at first glance.

What the research says

Indeed, research has found that, although people claim to fully comprehend the caution, they are often incapable of demonstrating any understanding of it at all. Frances Rock has also written extensively on the language of cautioning and found that when police officers explain the caution to detainees in custody, there is substantial variation in the explanations offered. Some explanations add clarity, while others introduce even more puzzles.

This issue of comprehensibility is compounded, of course, when the detainee is not a native speaker of English.

The word of the law.
Shutterstock

The difficulties in understanding legal language are typically overcome by the hiring of legal representation. Peter Tiersma, in his seminal 1999 book Legal Language, noted that “the hope that every man can be his own lawyer, which has existed for centuries, is probably no more realistic than having people be their own doctor”.

However, in the UK at least, cuts in legal aid mean that more people are representing themselves, removing the protection of a legal-language expert. Work by Tatiana Tkacukova has revealed the communicative struggles of these so-called “litigants in person” as they step into the courtroom arena of seasoned legal professionals.

Trained lawyers have developed finely-tuned cross-examination techniques, and all witnesses who take the stand, including the alleged victim or plaintiff, are likely to be subjected to gruelling cross-examination, characterised by coercive and controlling questioning. At best, witnesses might emerge from the courtroom feeling frustrated, and at worst victims may leave feeling victimised once again.

The work of forensic linguists has led to progress in some areas. For instance, it is long established that the cross-examination of alleged rape victims is often underpinned by societal preconceptions and prejudices which, when combined with rigorous questioning, are found to traumatise victims further. Recent reforms in England and Wales provide rape victims with the option to avoid “live” courtroom cross-examination and may go some way towards addressing this issue.

Further afield, an international group of linguists, psychologists, lawyers and interpreters have produced a set of guidelines for communicating rights to non-native speakers of English in Australia, England and Wales, and the US. These guidelines include recommendations for the wording and communication of cautions and rights to detainees, which aim to protect those already vulnerable from further problems of misunderstanding in the justice system.

Language will forever remain integral to our criminal justice system, and it will continue to disadvantage many who find themselves in the process. However, as the pool and remit of forensic linguists grow, there are greater opportunities to rebalance the linguistic inequalities of the legal system in favour of the layperson.

David Wright, Lecturer in Linguistics, Nottingham Trent University

This article was originally published on The Conversation. Read the original article.

English language bar for citizenship likely to further disadvantage refugees


Prime Minister Malcolm Turnbull has proposed tougher language requirements for new citizenship applicants.
Lukas Coch/AAP

Sally Baker, University of Newcastle and Rachel Burke, University of Newcastle

Citizenship applicants will need to demonstrate a higher level of English proficiency if the government’s proposed changes to the Australian citizenship test go ahead.

Applicants will be required to reach the equivalent of Band 6 proficiency of the International English Language Testing System (IELTS).

To achieve Band 6, applicants must correctly answer 30 out of 40 questions in the reading paper and 23 out of 40 in the listening paper, while the writing paper rewards language used “accurately and appropriately”. If a candidate’s writing contains “frequent” inaccuracies in grammar and spelling, they cannot achieve Band 6.

Success in IELTS requires proficiency not only in English, but also in knowing how to take – and pass – a test. The proposed changes will therefore make it harder for people with fragmented educational backgrounds, such as many refugees, to become citizens.

How do the tests currently work?

The current citizenship test consists of 20 multiple-choice questions in English concerning Australia’s political system, history, and citizen responsibilities.

While the test does not require demonstration of English proficiency per se, it acts as an indirect assessment of language.

For example, the question: “Which official symbol of Australia identifies Commonwealth property?” demonstrates the level of linguistic complexity required.

The IELTS test is commonly taken for immigration purposes as a requirement for certain visa categories – even though its designers argue that it was never intended for this purpose. Researchers have argued that the growing strength of English as the language of politics and economics has resulted in its widespread use for immigration purposes.

Impact of proposed changes

English is undoubtedly important for participation in society. But deciding citizenship on the basis of a high-stakes language test could further marginalise community members such as people with refugee backgrounds, who have the greatest need for citizenship yet lack the formal educational background to navigate such tests.

The Refugee Council of Australia argues that adults with refugee backgrounds will be hardest hit by the proposed language test.

Data shows that refugees are both more likely to apply for citizenship, and twice as likely as other migrant groups to have to retake the test.

Mismatched proficiency expectations

The Adult Migrant English Program (AMEP), where many adult refugees access English learning upon arrival, expects only a “functional” level of language proficiency.

For many adult refugees – who may have minimal first-language literacy, fragmented educational experiences, and limited opportunities to gain feedback on their written English – the required “competency” may put citizenship out of reach. This is also more likely to affect refugee women, who are less likely to have had formal schooling and more likely to assume caring duties.

Bar too high?

The challenges faced in re/settlement contexts, such as pressures of work and financial responsibilities to extended family, often combine to make learning a language difficult and, by extension, prevent refugees from completing the citizenship test.

Similar patterns are evident with IELTS. Nearly half of Arabic speakers who took the IELTS in 2015 scored lower than Band 6.

There are a number of questions to clarify regarding the proposed language proficiency test:

  • Will those dealing with trauma-related experiences gain exemption from a high-stakes, time-pressured examination?
  • What support mechanisms will be provided to assist applicants to study for the test?
  • Will financially disadvantaged members of the community be expected to pay for classes/materials in order to prepare for the citizenship test?
  • The IELTS test costs A$330, with no subsidies available. Will the IELTS-based citizenship/language test attract similar fees?

There are also questions about the fairness of requiring applicants to demonstrate a specific type and level of English under examination conditions that is not required of all citizens. Those born in Australia are not required to pass an academic test of language in order to retain their citizenship.

Recognising diversity of experiences

There are a few things the government should consider before introducing a language test:

1) Community consultation is essential. Input from community/migrant groups, educators, and language assessment specialists will ensure the test functions as a valid evaluation of progression towards English language proficiency. The government is currently calling for submissions related to the new citizenship test.

2) Design the test to value different forms and varieties of English that demonstrate progression in learning rather than adherence to prescriptive standards.

3) Provide educational opportunities that build on existing linguistic strengths and help people prepare for the test.

Equating a particular type of language proficiency with a commitment to Australian citizenship is a complex and ideologically loaded notion. The government must engage in careful consideration before potentially further disadvantaging those most in need of citizenship.

Sally Baker, Research Associate, Centre of Excellence for Equity in Higher Education, University of Newcastle and Rachel Burke, Lecturer, University of Newcastle

This article was originally published on The Conversation. Read the original article.

Younger is not always better when it comes to learning a second language


Learning a language in a classroom is best for early teenagers.
from http://www.shutterstock.com

Warren Midgley, University of Southern Queensland

It’s often thought that it is better to start learning a second language at a young age. But research shows that this is not necessarily true. In fact, the best age to start learning a second language can vary significantly, depending on how the language is being learned.

The belief that younger children are better language learners is based on the observation that children learn to speak their first language with remarkable skill at a very early age.

Before they can add two small numbers or tie their own shoelaces, most children develop a fluency in their first language that is the envy of adult language learners.

Why younger may not always be better

Two theories from the 1960s continue to have a significant influence on how we explain this phenomenon.

The theory of “universal grammar” proposes that children are born with an instinctive knowledge of the language rules common to all humans. Upon exposure to a specific language, such as English or Arabic, children simply fill in the details around those rules, making the process of learning a language fast and effective.

The other theory, known as the “critical period hypothesis”, posits that at around the age of puberty most of us lose access to the mechanism that made us such effective language learners as children. These theories have been contested, but nevertheless they continue to be influential.

Despite what these theories would suggest, however, research into language learning outcomes demonstrates that younger may not always be better.

In some language learning and teaching contexts, older learners can be more successful than younger children. It all depends on how the language is being learned.

Language immersion environment best for young children

Living, learning and playing in a second language environment on a regular basis is an ideal learning context for young children. Research clearly shows that young children are able to become fluent in more than one language at the same time, provided there is sufficient engagement with rich input in each language. In this context, it is better to start as young as possible.

Learning in classroom best for early teens

Learning in language classes at school is an entirely different context. The normal pattern of these classes is to have one or more hourly lessons per week.

To succeed at learning with such little exposure to rich language input requires meta-cognitive skills that do not usually develop until early adolescence.

For this style of language learning, the later years of primary school are an ideal time to start, maximising the balance between meta-cognitive skill development and the number of consecutive years of study available before the end of school.

Self-guided learning best for adults

There are, of course, some adults who decide to start to learn a second language on their own. They may buy a study book, sign up for an online course, purchase an app or join face-to-face or virtual conversation classes.

To succeed in this learning context requires a range of skills that are not usually developed until reaching adulthood, including the ability to remain self-motivated. Therefore, self-directed second language learning is more likely to be effective for adults than younger learners.

How we can apply this to education

What does this tell us about when we should start teaching second languages to children? In terms of the development of language proficiency, the message is fairly clear.

If we are able to provide lots of exposure to rich language use, early childhood is better. If the only opportunity for second language learning is through more traditional language classes, then late primary school is likely to be just as good as early childhood.

However, if language learning relies on being self-directed, it is more likely to be successful after the learner has reached adulthood.

Warren Midgley, Associate Professor of Applied Linguistics, University of Southern Queensland

This article was originally published on The Conversation. Read the original article.

Tough immigration laws are hitting Britain’s curry houses hard


shutterstock

Emily Falconer, University of Westminster

The British curry industry is responsible for 100,000 jobs and contributes more than £4 billion to the UK economy. But it’s now feared that up to a third of all Indian restaurants could disappear because of tougher immigration laws.

The current rules require restaurants that want to employ a chef from outside the EU to pay a minimum salary of £35,000 – or £29,750 with accommodation and food – to secure a visa.

These high costs have meant that many restaurants are unable to hire the skilled chefs they need, leading to a shortage of top talent, with the chefs who are available demanding higher wages. This combination of rising costs and a shortage of chefs means that many curry houses now face closure.

Fusion food

Britain has a long, deep relationship with what is widely known as “Indian” food. But food eaten on the Indian subcontinent is so diverse that it has as many differences as it has similarities. “Indian” and “curry” are therefore often used as umbrella terms for what is in reality a multifaceted combination of tastes and influences.

It’s been predicted that more than half of all curry houses may shut down within ten years.
Shutterstock

“Indian food” in reality is often derived from particular regions of India, Pakistan, Bangladesh and Sri Lanka as well as across Britain and Europe. And a long and complex history of colonialism and migration has made the “British Curry” a popular national dish.

As the author Panikos Panayi explains, decades of residing in Britain have inevitably changed the tastes and eating practices of many British Asian communities – whose connection with traditional foods has become increasingly tenuous.

In his book Spicing Up Britain: The Multicultural History of British Food, Panayi charts the patterns of migration and the influences of food, taste and consumption habits. He follows the tastes of British Asians who have grown up with a fusion of tastes and influences all their lives.

These are people whose diets reflect the variants of English food their parents invented to make use of the ingredients readily available to them – as opposed to just tastes from the Indian subcontinent. Childhood classics became spicy cheese on toast, baked beans balti with spring onion sabji, and masala burgers.

Merging of tastes

Panayi claims that the taste of South Asian food became as much a part of the childhood tastes of white British children living in certain areas of the UK as of their second- and third-generation Asian school friends.

In the London borough of Tower Hamlets, for example – which is home to a large Bangladeshi community – local councillors played a significant role in influencing the content of school dinners. As early as the 1980s these lunches often included Asian vegetarian dishes such as chapattis and rice, as well as halal meat, alongside “English” staples of chips, peas and steamed sponge with custard.

Fish and chips and curry sauce – a British speciality.
Flickr/Liz Barker, CC BY-NC

These tastes shaped the palates of many British children, to the point where a combination of “English” food and “curry” became the nostalgic taste of childhood. This was commodified by major brands such as Bisto with their “curry sauce” gravy granules.

These combinations are still a main feature of many “greasy spoon” English cafes and pub menus – which feature British staples such as curry served with a choice of either rice or chips, or jacket potatoes with a spicy chicken tikka filling. Then there’s the coronation chicken sandwich – a blend of boiled chicken, curry powder, mayonnaise and sultanas – a nod to the dish created for Queen Elizabeth II’s coronation lunch in 1953.

More recently, in a time of gastronomic obsession and “foodie” culture, the “hybridisation” of cuisines has shifted from being a matter of necessity – due to availability of ingredients – to an increasingly sophisticated, cosmopolitan and fashionable food trend.

‘One spicy crab coming right up’.
Shutterstock

The influential taste of the British curry can now be identified on modern British fine dining menus, where fillets of Scottish salmon, hand-dived scallops and Cornish crabmeat are infused with spiced cumin, turmeric and fenugreek, while bread and butter pudding is laced with cardamom and saffron.

Multicultural Britain

But in the current political climate of migration restrictions, the free movement of people across borders looks ever more threatened – and with it, our rich cultural heritage as a multicultural country.

As diverse as the food on our plates.
Shutterstock

This will undoubtedly have a detrimental impact on imported food produce and ingredients. And it will also impact the diverse communities which have brought with them long histories of knowledge, recipes and cooking practices.

Of course, throughout history there has always been a degree of racism and resistance to “foreign” foods, but for the most part these tastes have become embraced and firmly appropriated into the British diet.

Perhaps then we can take heart during this uncertain time that merging cultures will be a British tradition that is set to continue. Because what started as the “taste of the other” is now so deeply ingrained in our food, culture and identity that it is no longer possible to disentangle national, regional or local tastes to determine what belongs where.

Emily Falconer, Lecturer in Sociology, University of Westminster

This article was originally published on The Conversation. Read the original article.

Things you were taught at school that are wrong



Misty Adoniou, University of Canberra

Do you remember being taught you should never start your sentences with “And” or “But”?

What if I told you that your teachers were wrong and there are lots of other so-called grammar rules that we’ve probably been getting wrong in our English classrooms for years?

How did grammar rules come about?

To understand why we’ve been getting it wrong, we need to know a little about the history of grammar teaching.

Grammar is how we organise our sentences in order to communicate meaning to others.

Those who say there is one correct way to organise a sentence are called prescriptivists. Prescriptivist grammarians prescribe how sentences must be structured.

Prescriptivists had their day in the sun in the 18th century. As books became more accessible to the everyday person, prescriptivists wrote the first grammar books to tell everyone how they must write.

These self-appointed guardians of the language just made up grammar rules for English, and put them in books that they sold. It was a way of ensuring that literacy stayed out of reach of the working classes.

They took their newly concocted rules from Latin. This was, presumably, to keep literate English out of reach of anyone who wasn’t rich or posh enough to attend a grammar school, which was a school where you were taught Latin.

And yes, that is the origin of today’s grammar schools.

The other camp of grammarians are the descriptivists. They write grammar guides that describe how English is used by different people, and for different purposes. They recognise that language isn’t static, and it isn’t one-size-fits-all.

1. You can’t start a sentence with a conjunction

Let’s start with the grammatical sin I have already committed in this article. You can’t start a sentence with a conjunction.

Obviously you can, because I did. And I expect I will do it again before the end of this article. There, I knew I would!

Those who say it is always incorrect to start a sentence with a conjunction, like “and” or “but”, sit in the prescriptivist camp.

However, according to the descriptivists, at this point in our linguistic history it is fine to start a sentence with a conjunction in an op-ed article like this, or in a novel or a poem.

It is less acceptable to start a sentence with a conjunction in an academic journal article, or in an essay for my son’s high school economics teacher, as it turns out. But times are changing.

2. You can’t end a sentence with a preposition

Well, in Latin you can’t. In English you can, and we do all the time.

Admittedly, a lot of the younger generation don’t even know what a preposition is, so this rule is already obsolete. But let’s have a look at it anyway, for old times’ sake.

According to this rule, it is wrong to say “Who did you go to the movies with?”

Instead, the prescriptivists would have me say “With whom did you go to the movies?”

I’m saving that structure for when I’m making polite chat with the Queen on my next visit to the palace.

That’s not a sarcastic comment, just a fanciful one. I’m glad I know how to structure my sentences for different audiences. It is a powerful tool. It means I usually feel comfortable in whatever social circumstances I find myself in, and I can change my writing style according to purpose and audience.

That is why we should teach grammar in schools. We need to give our children a full repertoire of language so that they can make grammatical choices that will allow them to speak and write for a wide range of audiences.

3. Put a comma when you need to take a breath

It’s a novel idea, synchronising your writing with your breathing, but the two have nothing to do with one another. If this is the instruction we give our children, it is little wonder commas are so poorly used.

Punctuation is a minefield and I don’t want to risk blowing up the internet. So here is a basic description of what commas do, and read this for a more comprehensive guide.

Commas provide demarcation between like grammatical structures. When adjectives, nouns, phrases or clauses are butting up against each other in a sentence, we separate them with a comma. That’s why I put commas between the three nouns and the two clauses in that last sentence.

Commas also provide demarcation for words, phrases or clauses that are embedded in a sentence for effect. The sentence would still be a sentence even if we took those words away. See, for example, the use of commas in this sentence.

4. To make your writing more descriptive, use more adjectives

American writer Mark Twain had it right.

“When you catch an adjective, kill it. No, I don’t mean utterly, but kill most of them – then the rest will be valuable.”

If you want your writing to be more descriptive, play with your sentence structure.

Consider this sentence from Liz Lofthouse’s beautiful children’s book Ziba Came on a Boat. It comes at a key turning point in the book, the story of a refugee’s escape.

“Clutching her mother’s hand, Ziba ran on and on, through the night, far away from the madness until there was only darkness and quiet.”

A beautifully descriptive sentence, and not an adjective in sight.

5. Adverbs are the words that end in ‘ly’

Lots of adverbs end in “ly”, but lots don’t.

Adverbs give more information about verbs. They tell us when, where, how and why the verb happened. So that means words like “tomorrow”, “there” and “deep” can be adverbs.

I say they can be adverbs because, actually, a word is just a word. It becomes an adverb, or a noun, or an adjective, or a verb when it is doing that job in a sentence.

In “deep into the night”, the word “deep” is an adverb. In “down a deep, dark hole”, it is an adjective. And when I “dive into the deep”, it is doing the work of a noun.

Time to take those word lists of adjectives, verbs and nouns off the classroom walls.

Time, also, to ditch those old Englishmen who wrote a grammar for their times, not ours.

If you want to understand what our language can do and how to use it well, read widely, think deeply and listen carefully. And remember, neither time nor language stands still – for any of us.


Misty Adoniou, Associate Professor in Language, Literacy and TESL, University of Canberra

This article was originally published on The Conversation. Read the original article.

Why was Shakespeare’s death such a non-event at the time?



Ian Donaldson, University of Melbourne

William Shakespeare died on 23 April 1616, 400 years ago, in the small Warwickshire town of his birth. He was 52 years of age: still young (or youngish, at least) by modern reckonings, though his death mightn’t have seemed to his contemporaries like an early departure from the world.

Most of the population who survived childhood in England at this time were apt to die before the age of 60, and old age was a state one entered at what today might be thought a surprisingly youthful age.

Many of Shakespeare’s fellow-writers had died, or were soon to do so, at a younger age than he: Christopher Marlowe, in a violent brawl, at 29; Francis Beaumont, following a stroke, at 31 (also in 1616: just 48 days, as it happened, before Shakespeare’s own death); Robert Greene, penitent and impoverished, of a fever, in the garret of a shoemaker’s house, at 34; Thomas Kyd, after “bitter times and privy broken passions”, at 35; George Herbert, of consumption, at 39; John Fletcher, from the plague, at 46; Edmund Spenser, “for lack of bread” (so it was rumoured), at 47; and Thomas Middleton, also at 47, from causes unknown.

The cause or causes of Shakespeare’s death are similarly unknown, though in recent years they have become a topic of persistent speculation. Syphilis contracted by visits to the brothels of Turnbull Street, mercury or arsenic poisoning following treatment for this infection, alcoholism, obesity, cardiac failure, a sudden stroke brought on by the alarming news of a family disgrace – that Shakespeare’s son-in-law, Thomas Quiney, husband of his younger daughter, Judith, had been responsible for the pregnancy and death of a young local woman named Margaret Wheeler – have all been advanced as possible factors leading to Shakespeare’s death.

A portrait of Shakespeare from the First Folio of his plays.

Francis Thackeray, Director of the Institute for Human Evolution at the University of the Witwatersrand, believes that cannabis was the ultimate cause of Shakespeare’s death, and has been hoping – in defiance of the famous ban on Shakespeare’s tomb (“Curst be he that moves my bones”, etc.) – to inspect the poet’s teeth in order to confirm this theory. (“Teeth are not bones”, Dr Thackeray somewhat controversially insists.) No convincing evidence, alas, has yet been produced to support any of these theories.

More intriguing than the actual pathology of Shakespeare’s death, however, may be another set of problems that have largely evaded the eye of biographers, though they seem at times – in a wider, more general sense – to have held the poet’s own sometimes playful attention. They turn on the question of fame: how it is constituted; how slowly and indirectly it’s often achieved, how easily it may be delayed, diverted, or lost altogether from view.

No memorial gathering

On 25 April 1616, two days after his death, Shakespeare was buried in the chancel of Holy Trinity Church at Stratford, having earned this modest place of honour as much (it would seem) through his local reputation as a respected citizen as from any deep sense of his wider professional achievements.

No memorial gatherings were held in the nation’s capital, where he had made his career, or, it would seem, elsewhere in the country. The company of players that he had led for so long did not pause (so far as we know) to acknowledge his passing, nor did his patron and protector, King James, whom he had loyally served.

Only one writer, a minor Oxfordshire poet named William Basse, felt moved to offer, at some unknown date following his death, a few lines to the memory of Shakespeare, with whom he may not have been personally acquainted. Hoping that Shakespeare might be interred at Westminster but foreseeing problems of crowding at the Abbey, Basse began by urging other distinguished English poets to roll over in their tombs, in order to make room for the new arrival.

Renownèd Spenser, lie a thought more nigh

To learned Chaucer; and rare Beaumont, lie

A little nearer Spenser, to make room

For Shakespeare in your threefold, fourfold tomb.

None of these poets responded to Basse’s injunctions, however, and Shakespeare was not to win his place in the Abbey for more than a hundred years, when Richard Boyle, third Earl of Burlington, commissioned William Kent to design and Peter Scheemakers to sculpt a life-size white marble statue of the poet – standing cross-legged, leaning thoughtfully on a pile of books – to adorn Poets’ Corner.

A Derby porcelain figure of Shakespeare modelled after the statue of 1741 by Peter Scheemakers in Poets’ Corner.
Wikimedia images

On the wall behind this statue, erected in the Abbey in January 1741, is a tablet with a Latin inscription (perhaps contributed by the poet Alexander Pope) conceding the belated arrival of the memorial: “William Shakespeare,/124 years after his death/ erected by public love”.

Basse’s verses were in early circulation, but not published until 1633. No other poem to Shakespeare’s memory is known to have been written before the appearance of the First Folio in 1623. No effort appears to have been made in the months and years following the poet’s death to assemble a tributary volume, honouring the man and his works. None of Shakespeare’s other contemporaries noted the immediate fact of his passing in any surviving letter, journal, or record. No dispatches, private or diplomatic, carried the news of his death beyond Britain to the wider world.

Why did the death of Shakespeare cause so little public grief, so little public excitement, in and beyond the country of his birth? Why wasn’t his passing an occasion for widespread mourning, and widespread celebration of his prodigious achievements? What does this curious silence tell us about Shakespeare’s reputation in 1616; about the status of his profession and the state of letters more generally in Britain at this time?

A very quiet death

Shakespeare’s death occurred upon St George’s Day. That day was famous for the annual rites of prayer, procession, and feasting at Windsor by members of the Order of the Garter, England’s leading chivalric institution, founded in 1348 by Edward III. Marking as it did the anniversary of the supposed martyrdom in AD 303 of St George of Cappadocia, St George’s Day was celebrated in numerous countries in and beyond Europe, as it is today, but had emerged somewhat bizarrely in late mediaeval times as a day of national significance in England.

Tourists watch actors perform at the house where William Shakespeare was born during celebrations to mark the 400th anniversary of his death.
Dylan Martinez/Reuters

On St George’s Day 1616, as Shakespeare lay dying in far-off Warwickshire, King James – seemingly untroubled by prior knowledge of this event – was entertained in London by a poet of a rather different order named William Fennor.

Fennor was something of a royal favourite, famed for his facetious contests in verse, often in the King’s presence, with the Thames bargeman, John Taylor, the so-called Water Poet: a man whom James – as Ben Jonson despairingly reported to William Drummond – reckoned to be the finest poet in the kingdom.

In the days and weeks that followed, as the news of the poet’s death (one must assume) filtered gradually through to the capital, there is no recorded mention in private correspondence or official documents of Shakespeare’s name. Other more pressing matters were now absorbing the nation. Shakespeare had made a remarkably modest exit from the theatre of the world: largely un-applauded, largely unobserved. It was a very quiet death.

An age of public mourning

The silence that followed the death of Shakespeare is the more remarkable coming as it did in an age that had developed such elaborate rituals of public mourning, panegyric, and commemoration, most lavishly displayed at the death of a monarch or peer of the realm, but also occasionally set in train by the death of an exceptional commoner.

Consider the tributes paid to another great writer of the period, William Camden, antiquarian scholar and Clarenceux herald of arms, who died in London in late November 1623; a couple of weeks, as chance would have it, after the publication of Shakespeare’s First Folio.

Portrait of William Camden by Marcus Gheeraerts the Younger (1609).
Wikimedia commons

Camden was a man of quite humble social origins – like Shakespeare himself, whose father was a maker of gloves and leather goods in Stratford. Camden’s father was a painter-stainer, whose job it was to decorate coats of arms and other heraldic devices. By the time of his death Camden was widely recognised, in Britain and abroad, as one of the country’s outstanding scholars.

Eulogies were delivered at Oxford and published along with other tributes in a memorial volume soon after his death. At Westminster his body was escorted to the Abbey on 19 November by a large retinue of mourners, led by 26 poor men wearing gowns, followed by soberly attired gentlemen, esquires, knights, and members of the College of Arms, the hearse being flanked by earls, barons, and other peers of the realm, together with the Lord Keeper, Bishop John Williams, and other divines. Camden’s imposing funeral mirrored on a smaller scale the huge procession of 1,600 mourners which in 1603 had accompanied the body of Elizabeth I to its final resting place in the Abbey.

There were particular reasons, then, why Camden should have been accorded a rather grand funeral of his own. But mightn’t there have been good reasons for Shakespeare, likewise – whom we see today as the outstanding writer of his age – to have been honoured at his death in a suitably ceremonious fashion? It’s curious to realize, however, that Shakespeare at the time of his death wasn’t yet universally seen as the outstanding writer of his age.

Ben Jonson by George Vertue (1684-1756) after Gerard van Honthorst (1590-1656).
Wikimedia images

At this quite extraordinary moment in the history of English letters and intellectual exchange there was more than one contender for that title. William Camden himself – an admired poet in addition to his other talents, and friend and mentor of other poets of the day – had included Shakespeare’s name in a list, published in 1614, of “the most pregnant wits of these our times, whom succeeding ages may justly admire”, placing him, without differentiation, alongside Edmund Spenser, John Owen, Thomas Campion, Michael Drayton, George Chapman, John Marston, Hugh Holland and Ben Jonson, the last two of whom he had taught at Westminster School.

But it was another poet, Sir Philip Sidney, whom Camden had befriended during his student days at Oxford, that he most passionately admired, and continued to regard – following Sidney’s early death at the age of 32 in 1586 – as the country’s supreme writer. “Our Britain is the glory of earth and its precious jewel,/ But Sidney was the precious jewel of Britain”, Camden had written in a memorial poem in Latin mourning his friend’s death.

No commoner poet in England had ever been escorted to his grave with such pomp as was furnished for Sidney’s funeral at St Paul’s Cathedral, London, on 16 February 1587.

The 700-man procession was headed by 32 poor men, representing the number of years that Sidney had lived, with fifes and drums “playing softly” beside them. They were followed by trumpeters and gentlemen and yeomen servants, physicians, surgeons, chaplains, knights and esquires, heralds bearing aloft Sidney’s spurs and gauntlet, his helm and crest, his sword and targe, his coat of arms. Then came the hearse containing Sidney’s body. Behind them walked the chief mourner, Philip’s young brother, Robert, accompanied by the Earls of Leicester, Pembroke, Huntingdon, and Essex, followed by representatives from the states of Holland and Zealand. Next came the Lord Mayor and Aldermen of the City of London, with 120 members of the Company of Grocers, and, at the rear of the procession, “citizens of London practised in arms, about 300, who marched three by three”.

A 1587 engraving by Theodor de Bry showing the casket of Sir Philip Sidney carried by pallbearers.
Wikimedia images

Sidney’s funeral was a moving salute to a man who was widely admired not just for his military, civic and diplomatic virtues, but as the outstanding writer of his day. He fulfilled in exemplary fashion, as Shakespeare curiously did not, the Renaissance ideal of what a poet should strive to be.

In an extraordinary act of homage not before seen in England, but soon to be commonly followed at the death of distinguished writers, the Universities of Oxford and Cambridge produced three volumes of Latin verse lauding Sidney’s achievements, while a fourth volume of similar tributes was published by the University of Leiden. The collection from Cambridge presented contributions from 63 Cambridge men, together with a sonnet in English by King James VI of Scotland, the future King James I of Britain.

Earlier English poets had been mourned at their passing, if not in these terms and not on this scale, then with more enthusiasm than was evident at the death of Shakespeare. Edmund Spenser at his death in 1599 was buried in Westminster Abbey next to Chaucer, “this hearse being attended by poets, and mournful elegies and poems with the pens that wrote them thrown into his tomb”. The deaths of Thomas Wyatt and Michael Drayton were similarly lamented.

When, 21 years after Shakespeare’s death, his former friend and colleague Ben Jonson came at last to die, the crowd that gathered at his house in Westminster to accompany his body to his grave in the Abbey included “all or the greatest part of the nobility and gentry then in the town”. Within months of his death a volume of 33 poems was in preparation and a dozen additional elegies had appeared in print. Jonson was hailed at his death as “king of English poetry”, as England’s “rare arch-poet”. With his death, as more than one memorialist declared, English poetry itself now seemed also to have died. No one had spoken in these terms at the death of Shakespeare.

To take one last example: at the death in 1643 of the dramatist William Cartwright – whose works and whose very name are barely known to most people today – Charles I elected to wear black, remarking that

since the muses had so much mourned for the loss of such a son, it would be a shame for him not to appear in mourning for the loss of such a subject.

At the death of Shakespeare in 1616 James had shown no such minimal courtesy.

Backroom boys

Why should Shakespeare at his death have been so neglected? One simple answer is that King James, unlike his son, Charles, had no great passion for the theatre, and no very evident regard for Shakespeare’s genius. Early in his reign, so Dudley Carleton reported,

The first holy days we had every night a public play in the great hall, at which the King was ever present, and liked or disliked as he saw cause: but it seems he takes no extraordinary pleasure in them.

But Shakespeare and his company were not merely royal servants, bound to provide a steady supply of dramatic entertainment at court; they also catered for the London public who flocked to see their plays at Blackfriars and the Globe, and who had their own ways of expressing their pleasure, their frustrations, and – at the death of a player – their grief.

A portrait of the actor Richard Burbage.
Wikimedia images

When Richard Burbage, the principal actor for the King’s Men, died on 9 March 1619, just seven days after the death of Queen Anne, the London public were altogether more upset by that event than they had been over the death of the Queen, as one contemporary writer – quoting, ironically, the opening lines of Shakespeare’s 1 Henry VI – tartly observed.

So it’s necessary, I think, to pose a further question. Why should the death of Burbage have affected the London public more profoundly than the death not merely of the Queen but of the dramatist whose work he so skilfully interpreted?

I believe the answer lies, partly at least, in the status of the profession to which Shakespeare belonged, a profession which didn’t yet have a regular name: the very words playwright and dramatist not entering the language until half a century after Shakespeare’s death.

Prominent actors at this time were far better known to the public than the writers who provided their livelihood. The writers were on the whole invisible people, who worked as backroom boys, often anonymously and in small teams; playgoers had no easy way of discovering their identity. Theatre programmes didn’t yet exist. Playbills often announced the names of leading actors, but not until the very last decade of the 17th century did they include the names of authors.

Moreover, only a fraction of the large number of plays performed in this period found their way into print, and those that were published didn’t always disclose the names of their authors.

At the time of Shakespeare’s death half of his plays weren’t yet available in print, and there were no known plans to produce a collected edition of his works. The total size and shape of the canon were therefore still imperfectly known. Shakespeare was not yet fully visible.

In 1616 the world didn’t yet realise what it had got, or who it was that it had lost. Hence, I believe, the otherwise inexplicable silence at his passing.

To the Memory of My Beloved

At the time of Shakespeare’s death another English writer was arguably better known to the general public than Shakespeare himself, and more highly esteemed by the brokers of power at King James’s court. That writer was Shakespeare’s friend and colleague Ben Jonson, who early in 1616 had been awarded a pension of one hundred marks to serve as King James’s laureate poet.

A 1623 copy of the calf-bound First Folio edition of William Shakespeare’s plays.
Dylan Martinez/Reuters

A first folio edition of Shakespeare’s collected plays was finally published in London with Jonson’s assistance and oversight in 1623. This monumental volume at last gave readers in England some sense of the wider reach of Shakespeare’s theatrical achievement, and laid the essential foundations of his modern reputation.

At the head of this volume stand two poems by Ben Jonson: the second, To the Memory of My Beloved, the Author, Mr William Shakespeare, and What He Hath Left Us, assesses the achievement of this extraordinary writer. Shakespeare had been praised during his lifetime as a “sweet”, “mellifluous”, “honey-tongued”, “honey-flowing”, “pleasing” writer. No one until this moment had presented him in the astounding terms that Jonson here proposes: as the pre-eminent figure, the “soul” and the “star” of his age; and as something even more than that, as one who could be confidently ranked with the greatest writers of antiquity and of the modern era.

Triumph, my Britain, thou hast one to show
To whom all scenes of Europe homage owe.
He was not of an age, but for all time!

Today, 400 years on, that last line sounds like a truism, for Shakespeare’s fame has indeed endured. He is without doubt the most famous writer the world has ever seen. But in 1623 this was a bold and startling prediction. No one before that date had described Shakespeare’s achievement in such terms as these.

This is an edited version of a public lecture given at the University of Melbourne.

On the 400th anniversary of Shakespeare’s death, the Faculty of Arts at the University of Melbourne is establishing the Shakespeare 400 Trust to raise funds to support the teaching of Shakespeare at the University into the future. For more information, or if you would like to support the Shakespeare 400 Trust, please contact Julie du Plessis at julie.dp@unimelb.edu.au


Ian Donaldson, Honorary Professorial Fellow, School of Culture and Communication, University of Melbourne

This article was originally published on The Conversation. Read the original article.

How training can prepare teachers for diversity in their classrooms



Maureen Robinson, Stellenbosch University

Teachers have been shaping lives for centuries. Everyone remembers their favourite (and of course their least favourite) teachers. This important group of people even has its own special day, marked each October by the United Nations.

Teachers are at the coal face when it comes to watching societies change. South Africa’s classrooms, for instance, look vastly different today than they did two decades ago. They bring together children from different racial, cultural, economic and social backgrounds. This can sometimes cause conflict as varied ways of understanding the world bump up against each other.

How can teachers develop the skills to work with these differences in productive ways? What practical support do they need to bring the values of the Constitution to life in their classes?

To answer these questions, my colleagues and I in the Faculty of Education at Stellenbosch University have put together four examples from modules within our faculty’s teacher education programme. These ideas are by no means exhaustive; other institutions also tackle these issues. What we present here is based on our own research, teaching and experience and is open to further discussion.

1. Working with multilingualism

English is only South Africa’s fifth most spoken home language. Teachers must remember this: even if their pupils are speaking English in the classroom, their home languages may be far more diverse.

Trainee teachers can benefit enormously from a course on multilingual education. In our faculty, for instance, students are given the chance to place multilingual education in a South African policy framework. They model multilingual classroom strategies like code switching and translation. They visit schools to observe how such strategies are applied in the real classroom. Students then report back on whether this approach helps learners from different language backgrounds to participate actively in the lesson.

There’s also great value in introducing student teachers to the notion of “World Englishes”. This approach focuses on the role of English in multilingual communities, where the language is used for communication and academic purposes rather than as a route into an English-speaking community.

2. Supporting diverse learning needs

Student teachers must be trained to identify and support pupils’ diverse learning needs. This helps teachers to identify and address barriers to learning and development and encourages linkages between the home and the school.

This is even more meaningful when it is embedded in experiential learning. For instance, in guided exercises with their own class groups, our students engage with their feelings, experiences and thinking about their own backgrounds and identities. Other activities may be based on real scenarios, such as discussing the case of a boy who was sanctioned by his school for wearing his hair in a way prescribed by his religion.

In these modules we focus on language, culture, race, socioeconomic conditions, disability, sexual orientation, learning differences and behavioural, health or emotional difficulties. The students also learn how to help vulnerable learners who are being bullied.

And these areas are constantly expanding. At Stellenbosch University, we’ve recently noted that we need to prepare teachers to deal with the bullying of LGBT learners. They also need to be equipped with the tools to support pupils who’ve immigrated from elsewhere in Africa.

3. Advancing a democratic classroom

Courses that deal with the philosophy of education are an important element of teacher education. These explore notions of diversity, human dignity, social justice and democratic citizenship.

In these classes, student teachers are encouraged to see their own lecture rooms as spaces for open and equal engagement, with regard and respect for different ways of being. They’re given opportunities to express and engage with controversial views. This stands them in good stead to create such spaces in their own classrooms.

Most importantly, students are invited to critically reconsider commonly held beliefs – and to disrupt their ideas of the world – so that they might encounter the other as they are and not as they desire them to be. In such a classroom, a teacher promotes discussion and debate. She cultivates respect and regard for the other by listening to different accounts and perspectives. Ultimately, the teacher accepts that she is just one voice in the classroom.

4. Understanding constitutional rights in the classroom

All the approaches to teacher education described here are underpinned by the Constitution.

The idea is that teacher education programmes should develop teachers who understand notions of justice, citizenship and social cohesion. Any good teacher needs to be able to reflect critically on their own role as leader and manager within the contexts of classrooms, schools and the broader society. This includes promoting values of democracy, social justice and equality, and building attitudes of respect and reciprocity.

A critical reflective ethos is encouraged. Students get numerous opportunities to interrogate, debate, research, express and reflect upon educational challenges, theories and policies, from different perspectives, as these apply to practice. This is all aimed at building a positive school environment for everyone.

Moving into teaching

What about when students become teachers themselves?

For many new teachers these inclusive practices are not easy to implement in schools. One lecturer in our faculty has been approached by former students who report that as beginner teachers, they don’t have “the status or voice to change existing discriminatory practices and what some experience as the resistance to inclusive education”. This suggests that ongoing discussion and training in both pre-service and in-service education is needed.

At the same time, however, there are signs that these modules are having a positive impact. Students post comments and ideas on social media and lecturers regularly hear from first-time teachers about how useful their acquired knowledge is in different contexts. Many are also eager to study further so they can explore the issues more deeply.

Everything I’ve described here is part of one faculty’s attempts to provide safe spaces where student teachers can learn to work constructively with the issues pertaining to diversity in education. In doing so, we hope they’ll become part of building a country based on respect for all.

Author’s note: I am grateful to my colleagues Lynette Collair, Nuraan Davids, Jerome Joorst and Christa van der Walt for the ideas contained in this article.


Maureen Robinson, Dean, Faculty of Education, Stellenbosch University

This article was originally published on The Conversation. Read the original article.

Clear skies ahead: how improving the language of aviation could save lives



Dominique Estival, Western Sydney University

The most dangerous part of flying is driving to the airport.

That’s a standard joke among pilots, who know even better than the flying public that aviation is the safest mode of transportation.

But there are still those headlines and TV shows about airline crashes, and those statistics people like to repeat, such as:

Between 1976 and 2000, more than 1,100 passengers and crew lost their lives in accidents in which investigators determined that language had played a contributory role.

True enough, 80% of all air incidents and accidents occur because of human error. Miscommunication, combined with other human factors such as fatigue, cognitive workload, noise or forgetfulness, has played a role in some of the deadliest accidents.

The best-known and most widely discussed is the 1977 ground collision of two Boeing 747 aircraft at Tenerife, which resulted in 583 fatalities. The accident was due in part to difficult communications between the pilot, whose native language was Dutch, and the Spanish air traffic controller.

In such a high-stakes environment as commercial aviation, where the lives of hundreds of passengers and innocent people on the ground are involved, communication is critical to safety.

So it was decided that Aviation English would be the international language of aviation, and that all aviation professionals – pilots and air traffic controllers (ATC) – would need to be proficient in it. It is a highly structured and codified language, designed to minimise ambiguity and misunderstanding.

Pilots and ATC expect to hear certain bits of information in certain ways and in a given order. The “phraseology”, with its particular pronunciation (for example, “fife” and “niner” instead of “five” and “nine”, so they’re not confused with each other), specific words (“Cleared to land”), international alphabet (“Mike Hotel Foxtrot”) and strict conversation rules (you must repeat, or “read back”, an instruction), needs to be learned and practised.
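
To make that codification concrete, here is a minimal sketch in Python of how a callsign or number might be rendered in radio phraseology. Beyond the “fife” and “niner” mentioned above, the digit pronunciations follow what I understand to be the standard ICAO conventions (“tree” for three, “fower” for four); the helper spell_out is invented for this example, not part of any real aviation system.

```python
# A minimal sketch of ICAO-style phraseology: digits are spoken
# individually, with special pronunciations to avoid confusion, and
# letters map to the international spelling alphabet.
# Illustrative only; not a real aviation library.

DIGITS = {
    "0": "zero", "1": "one", "2": "two", "3": "tree", "4": "fower",
    "5": "fife", "6": "six", "7": "seven", "8": "eight", "9": "niner",
}

ALPHABET = {
    "A": "Alpha", "B": "Bravo", "C": "Charlie", "D": "Delta",
    "E": "Echo", "F": "Foxtrot", "G": "Golf", "H": "Hotel",
    "I": "India", "J": "Juliett", "K": "Kilo", "L": "Lima",
    "M": "Mike", "N": "November", "O": "Oscar", "P": "Papa",
    "Q": "Quebec", "R": "Romeo", "S": "Sierra", "T": "Tango",
    "U": "Uniform", "V": "Victor", "W": "Whiskey", "X": "X-ray",
    "Y": "Yankee", "Z": "Zulu",
}

def spell_out(token: str) -> str:
    """Render a callsign or number the way it would be read over the radio."""
    return " ".join(
        DIGITS.get(ch, ALPHABET.get(ch.upper(), ch)) for ch in token
    )

print(spell_out("MHF"))  # Mike Hotel Foxtrot
print(spell_out("095"))  # zero niner fife
```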

In spite of globalisation and the spread of English, most people around the world are not native English speakers, and an increasing number of aviation professionals do not speak English as their first language.

Native speakers have an advantage when they learn Aviation English, since they already speak English at home and in their daily lives. But they encounter many pilots or ATC who learned English as a second or even third language.

Whose responsibility is it to ensure that communication is successful? Can native speakers simply speak the way they do at home and expect to be understood? Or do they also have the responsibility to make themselves understood and to learn how to understand pilots or ATC who are not native English speakers?

As a linguist, I analyse aviation language from a linguistics perspective. I have noted the restricted meaning of its few verbs and adjectives; that the only pronouns are “you” and sometimes “we” (“How do you read?”; “We’re overhead Camden”); how few questions there are, most utterances being imperatives (“Maintain heading 180”); and that the syntax is so simple (no complement clauses, no relative clauses, no recursion) that it might not even count as a human language for Chomsky.
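
To give a sense of just how flat that syntax is, here is a toy recogniser for a tiny invented fragment of such phraseology: an imperative verb, a parameter name, and a value, with nothing embedded or recursive. The verb list and pattern are illustrative only, not an actual grammar of Aviation English.

```python
import re

# A toy pattern for a tiny, invented fragment of ATC-style instructions.
# Note the shape: imperative verb + parameter + value. There are no
# subordinate clauses and no recursion anywhere in the pattern.
INSTRUCTION = re.compile(
    r"^(?P<verb>climb|descend|maintain|turn left|turn right)\s+"
    r"(?:to\s+)?"
    r"(?P<param>heading|flight level|altitude)\s+"
    r"(?P<value>\d+)$",
    re.IGNORECASE,
)

for utterance in ["Maintain heading 180", "Descend to flight level 240"]:
    m = INSTRUCTION.match(utterance)
    if m:
        print(m.group("verb"), "|", m.group("param"), "|", m.group("value"))
```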

But, as a pilot and a flight instructor, I look at it from the point of view of student pilots learning to use it in the cockpit while also learning to fly the airplane and navigate around the airfield.

How much harder is it to remember what to say when the workload goes up, and to speak over the radio when you know everyone else on the frequency is listening and will notice every little mistake you make?

Imagine, then, how much more difficult this is for pilots with English as a second language.

Camden Airport.
Supplied

Everyone learning another language knows it’s suddenly more challenging to hold a conversation over the phone than face-to-face, even with someone you already know. When it’s over the radio, with someone you don’t know, against the noise of the engine, static noise in the headphones, and while trying to make the plane do what you want it to do, it can be quite daunting.

No wonder student pilots who are not native English speakers sometimes prefer to stay silent; even some experienced native English speakers do so when the workload is too great.

This is one of the results of my research conducted in collaboration with UNSW’s Brett Molesworth, combining linguistics and aviation human factors.

Experiments in a flight simulator with pilots of diverse language backgrounds and flying experience explored conditions likely to result in pilots making mistakes or misunderstanding ATC instructions. Not surprisingly, increased workload, too much information and rapid ATC speech caused mistakes.

Also not surprisingly, less experienced pilots, no matter their English proficiency, made more mistakes. But surprisingly, it was the level of training, rather than the number of flying hours or language background, that predicted better communication.

Once we understand the factors contributing to miscommunication in aviation, we can propose solutions to prevent them. For example, technologies such as Automatic Speech Recognition and Natural Language Understanding may help catch errors in pilot readbacks that ATC did not notice and might complement training for pilots and ATC.
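
As a rough illustration of the idea, the sketch below assumes both transmissions have already been transcribed to text (the job of the speech recogniser) and simply checks that the safety-critical items of an instruction reappear in the readback. The extraction rules and function names are hypothetical, not drawn from the research described above.

```python
import re

def key_items(utterance: str) -> set:
    """Pull out safety-critical tokens: numbers and parameter words."""
    return set(re.findall(
        r"\d+|heading|flight level|altitude|runway|left|right",
        utterance.lower(),
    ))

def readback_ok(instruction: str, readback: str) -> bool:
    """A readback passes only if every key item is repeated correctly."""
    return key_items(instruction) <= key_items(readback)

atc = "Turn left heading 180, descend flight level 240"
pilot = "Left heading 180, descend flight level 260"
print(readback_ok(atc, pilot))  # False: 240 was read back as 260
```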

It is vital that pilots and ATC understand each other, whatever their native language.


Dominique Estival, Researcher in Linguistics, Western Sydney University

This article was originally published on The Conversation. Read the original article.