British literature is richly tangled with other histories and cultures – so why is it sold as largely white and English?


Brick Lane: popularised in a novel by the British writer Monica Ali.
Shutterstock

Elleke Boehmer, University of Oxford and Erica Lombard, University of Oxford

Recent global developments have sharply polarised communities in many countries around the world. A new politics of exclusion has drawn urgent attention to the ways in which structural inequality has marginalised and silenced certain sectors of society. And yet, as a recent report shows, diversity and inclusion in fact “benefit the common good”. A more diverse group is a stronger, more creative and productive group.

In the world of literary writing, we find similar gaps and exclusions. But these are counterbalanced in some respects by new positive initiatives.

In 2015, a study revealed that literature by writers of colour had been consistently under-represented by the predominantly white British book industry. Statistics in The Bookseller show that out of thousands of books published in 2016 in the UK, fewer than 100 were by British authors of a non-white background. And out of 400 authors identified by the British public in a 2017 Royal Society of Literature survey, only 7% were black, Asian or of mixed race (compared to 13% of the population).

Colourful misrepresentation

A similar marginalisation takes place in the curricula in schools and universities, mirroring exclusions in wider society. In most English literature courses of whatever period, the writers taught are white, largely English and largely male.

A fundamental inequality arises in which, though British culture at large is diverse, syllabuses are not. Indeed, many British readers and students find little to recognise or to identify with when they read and study mainstream British literature.

But it’s not just a case of under-representation. It’s also a case of misrepresentation.

Black and Asian writers who have been published within the mainstream British system describe the pressure they have felt to conform to cultural stereotypes in their work. Their books are often packaged and presented in ways that focus on their ethnicity, regularly using cliches. At the same time, more universal aspects of their writing are overlooked. For example, the covers of novels by Asian British writers usually stick to a limited colour palette of yellows, reds, and purples, accented by “exotic” images.

 

These writers bristle at the sense that they are read not as crafters of words and worlds, but as spokespeople for their communities or cultures. At its worst, this process turns these writers and their books into objects of anthropological curiosity rather than works inviting serious literary study or simply pleasurable reading. The message is that black and Asian literature is other than or outside mainstream British writing.

Against these exclusions, leading British authors such as Bernardine Evaristo have called for a broader, more inclusive approach. They recognise that what and how we read shapes our sense of ourselves, our communities and the world.

Reframing the narrative

The Postcolonial Writers Make Worlds research project, based in the Oxford English Faculty and The Oxford Research Centre in the Humanities, set out to ask what it means to read contemporary fiction as British readers. Working with reading groups and in discussion with writers, we found that readers of all ages entered the relatively unfamiliar worlds created by BAME authors with interest.

For many, finding points of familiarity along gender, age, geographical or other lines was important for their ability to enjoy stories from communities different from their own. Identifying in this way gave some readers new perspectives on their own contexts. At the same time, unfamiliarity was not a barrier to identification. In some cases, universal human stories, like falling in love, acted as a bridge. This suggests that how literature is presented to readers, whether it is framed as other or not, can be as significant as what is represented.

Contemporary black and Asian writing from the UK is British writing. And this means that the work of writers such as Evaristo, Nadifa Mohamed and Daljit Nagra must be placed on the same library shelf, reading list and section of the bookshop as work by Ian McEwan, Julian Barnes and Ali Smith – not exclusively in “world interest” or “global literature” sections.


Equally, much can be gained by thinking of white British writers like Alan Hollinghurst or Hilary Mantel as having as much of a cross-cultural or even postcolonial outlook as Aminatta Forna and Kamila Shamsie.

There are positive signs. A new Edexcel/Pearson A-level teaching resource on Contemporary Black British Literature has been developed. The Why is My Curriculum White? campaign continues to make inroads in university syllabuses. And the Jhalak Prize is raising the profile of BAME writing in Britain. Against this background, the Postcolonial Writers Make Worlds website offers a multimedia hub of resources on black and Asian British writing, providing points of departure for more inclusive, wide-ranging courses. Yet there is still much to be done.

All literature written in English in the British Isles is densely entangled with other histories, cultures and pathways of experience, both within the country and far beyond. Syllabuses, publishing practices and our conversations about books must reflect this.

Elleke Boehmer, Professor of World Literature in English, University of Oxford and Erica Lombard, Postdoctoral Research Fellow, University of Oxford

This article was originally published on The Conversation. Read the original article.

 


How learning a new language improves tolerance


Why learn a new language?
Timothy Vollmer, CC BY

Amy Thompson, University of South Florida

There are many benefits to knowing more than one language. For example, it has been shown that aging adults who speak more than one language are less likely to develop dementia.

Additionally, the bilingual brain becomes better at filtering out distractions, and learning multiple languages improves creativity. Evidence also shows that learning subsequent languages is easier than learning the first foreign language.

Unfortunately, not all American universities consider learning foreign languages a worthwhile investment.

Why is foreign language study important at the university level?

As an applied linguist, I study how learning multiple languages can have cognitive and emotional benefits. One of these benefits that’s not obvious is that language learning improves tolerance.

This happens in two important ways.

The first is that it opens people’s eyes to ways of doing things that are different from their own, which is called “cultural competence.”

The second is related to the comfort level of a person when dealing with unfamiliar situations, or “tolerance of ambiguity.”

Gaining cross-cultural understanding

Cultural competence is key to thriving in our increasingly globalized world. How specifically does language learning improve cultural competence? The answer can be illuminated by examining different types of intelligence.

Psychologist Robert Sternberg’s research describes different types of intelligence and how they relate to adult language learning. What he refers to as “practical intelligence” is similar to social intelligence in that it helps individuals learn nonexplicit information from their environments, including meaningful gestures or other social cues.

Learning a foreign language reduces social anxiety.
COD Newsroom, CC BY

Language learning inevitably involves learning about different cultures. Students pick up clues about the culture both in language classes and through meaningful immersion experiences.

Researchers Hanh Thi Nguyen and Guy Kellogg have shown that when students learn another language, they develop new ways of understanding culture through analyzing cultural stereotypes. They explain that “learning a second language involves the acquisition not only of linguistic forms but also ways of thinking and behaving.”

With the help of an instructor, students can critically think about stereotypes of different cultures related to food, appearance and conversation styles.

Dealing with the unknown

The second way that adult language learning increases tolerance is related to a person’s comfort level when dealing with unfamiliar situations – their “tolerance of ambiguity.”

Someone with a high tolerance of ambiguity finds unfamiliar situations exciting, rather than frightening. My research on motivation, anxiety and beliefs indicates that language learning improves people’s tolerance of ambiguity, especially when more than one foreign language is involved.

It’s not difficult to see why this may be so. Conversations in a foreign language will inevitably involve unknown words. It wouldn’t be a successful conversation if one of the speakers constantly stopped to say, “Hang on – I don’t know that word. Let me look it up in the dictionary.” Those with a high tolerance of ambiguity would feel comfortable maintaining the conversation despite the unfamiliar words involved.

Applied linguists Jean-Marc Dewaele and Li Wei also study tolerance of ambiguity and have indicated that those with experience learning more than one foreign language in an instructed setting have more tolerance of ambiguity.

What changes with this understanding

A high tolerance of ambiguity brings many advantages. It helps students become less anxious in social interactions and in subsequent language learning experiences. Not surprisingly, the more experience a person has with language learning, the more comfortable the person gets with this ambiguity.

And that’s not all.

Individuals with higher levels of tolerance of ambiguity have also been found to be more entrepreneurial (i.e., are more optimistic, innovative and don’t mind taking risks).

In the current climate, universities are frequently being judged by the salaries of their graduates. Taking it one step further, given the relationship between tolerance of ambiguity and entrepreneurial intention, increased tolerance of ambiguity could lead to higher salaries for graduates, which, in turn, I believe, could help increase funding for those universities that require foreign language study.

Those who have devoted their lives to theorizing about and teaching languages would say, “It’s not about the money.” But perhaps it is.

Language learning in higher ed

Most American universities have a minimal language requirement that often varies depending on the student’s major. However, students can typically opt out of the requirement by taking a placement test or providing some other proof of competency.

Why more universities should teach a foreign language.
sarspri, CC BY-NC

In contrast to this trend, Princeton recently announced that all students, regardless of their competency when entering the university, would be required to study an additional language.

I’d argue that more universities should follow Princeton’s lead. Language study at the university level could lead to an increased tolerance of the different cultural norms represented in American society – something desperately needed in the current political climate, with a wave of hate crimes sweeping university campuses nationwide.

Knowledge of different languages is crucial to becoming global citizens. As former Secretary of Education Arne Duncan noted,

“Our country needs to create a future in which all Americans understand that by speaking more than one language, they are enabling our country to compete successfully and work collaboratively with partners across the globe.”

Considering the evidence that studying languages as adults increases tolerance in two important ways, the question shouldn’t be “Why should universities require foreign language study?” but rather “Why in the world wouldn’t they?”

Amy Thompson, Associate Professor of Applied Linguistics, University of South Florida

This article was originally published on The Conversation. Read the original article.

 

Is there such a thing as a national sense of humour?


A statue celebrating Monty Python’s sketch The Dead Parrot near London’s Tower Bridge ahead of a live show on the TV channel Gold.
DAVID HOLT/Flickr, CC BY-SA

Gary McKeown, Queen’s University Belfast

We’re all aware that there are stereotypes. The British are sharply sarcastic, the Americans are great at physical comedy, and the Japanese love puns. But is humour actually driven by culture to any meaningful extent? Couldn’t it be more universal – or depend largely on the individual?

There are some good reasons to believe that there is such a thing as a national sense of humour. But let’s start with what we actually have in common, by looking at the kinds of humour that most easily transcend borders.

Certain kinds of humour are more commonly used in circumstances that are international and multicultural in nature – such as airports. When it comes to onboard entertainment, airlines in particular are fond of humour that transcends cultural and linguistic boundaries, for obvious reasons. Slapstick humour and the bland but almost universally tolerable social transgressions and faux pas of Mr Bean permit a safe, gentle humour that we can all relate to. Similarly, the silent situational dilemmas of the Canadian hidden-camera reality television show Just for Laughs have been a staple option for airlines for many years.

Just for laughs.

These have a broad reach and are probably unlikely to offend most people. Of course, an important component in their broad appeal is that they are not really based on language.

Language and culture

Most humour, and certainly humour that involves greater cognitive effort, is deeply embedded in language and culture. It relies on a shared language or set of culturally based constructs to function. Puns and idioms are obvious examples.

Indeed, most modern theories of humour suggest that some form of shared knowledge is one of the key foundations of humour – that is, after all, what a culture is.

Some research has demonstrated this. One study measured humour in Singaporean college students and compared it with that of North American and Israeli students. This was done using a questionnaire asking participants to describe jokes they found funny, among other things. The researchers found that the Americans were more likely to tell sex jokes than the Singaporeans. The Singaporean jokes, on the other hand, were slightly more often focused on violence. The researchers interpreted the lack of sex jokes among Singaporean students as a reflection of a more conservative society, and the more aggressive jokes as reflecting a cultural emphasis on strength for survival.

International humour?
C.P.Storm/Flickr, CC BY-SA

Another study compared Japanese and Taiwanese students’ appreciation of English jokes. It found that the Taiwanese generally enjoyed jokes more than the Japanese and were also more eager to understand incomprehensible jokes. The authors argued that this could be down to a more hierarchical culture in Japan, leaving less room for humour.

Denigration and self-deprecation

There are many overarching themes that can be used to define a nation’s humour. A nation that laughs together is one that can show it has a strong allegiance among its citizens. Laughter is one of our main social signals and, combined with humour, it can emphasise social bonding – albeit sometimes at the cost of denigrating other groups. This can be seen across many countries. For example, the French tend to enjoy a joke about the Belgians while Swedes make fun of Norwegians. Indeed, most nations have a preferred country that serves as a traditional butt of their jokes.

Sexist and racist humour are also examples of this sort of denigration. The types of jokes used can vary across cultures, but the phenomenon itself can boost social bonding. Knowledge of acceptable social boundaries is therefore crucial and reinforces social cohesion. As denigration is usually not the principal aim of the interaction, this explains why people often fail to realise that they are being offensive when they are “only joking”. However, as the world becomes more global and tolerant of difference, this type of humour is much less acceptable in cultures that welcome diversity.

Self-denigration or self-deprecation is also important – if it is relatively mild and remains within acceptable social norms. Benign violation theory argues that something that threatens social or cultural norms can also result in humour.

Importantly, what constitutes a benign level of harm is strongly culturally bound and differs from nation to nation, between social groups within nations and over the course of a nation’s history. What was once tolerable as national humour can now seem very unacceptable. For the British, it may be acceptable to make fun of Britons being overly polite, orderly or reluctant to talk to strangers. However, jokes about the nature of Britain’s colonial past would be much more contentious – they would probably violate social norms without being emotionally benign.

Another factor is our need to demonstrate that we understand the person we are joking with. My own ideas suggest we even have a desire to display skills of knowing what another person thinks – mind-reading in the scientific sense. For this, cultural alignment and an ability to display it are key elements in humour production and appreciation – it can make us joke differently with people from our own country than with people from other cultures.

‘Fork handles’.

For example, most people in the UK know that the popular phrase “don’t mention the war” refers to a Fawlty Towers sketch. Knowing that “fork handles” is funny also marks you as a UK citizen (see video above). Similarly, knowledge of “I Love Lucy” or quotes from Seinfeld creates affiliation among many in the US, while references to “Chavo del Ocho” or “Chapulín Colorado” do the same for Mexicans and most Latin Americans.

These shared cultural motifs – here drawn mostly from television – are one important aspect of a national sense of humour. They create a sense of belonging and camaraderie. They make us feel more confident about our humour and can be used to build further jokes on.

A broadly shared sense of humour is probably one of our best indicators for how assimilated we are as a nation. Indeed, a nation’s humour is more likely to show unity within a country than to display a nation as being different from other nations in any meaningful way.

Gary McKeown, Senior Lecturer of Psychology, Queen’s University Belfast

This article was originally published on The Conversation. Read the original article.

Accessible, engaging textbooks could improve children’s learning


It’s not enough for textbooks just to be present in a classroom. They must support learning.
Global Partnership for Education/Flickr, CC BY-NC-ND

Lizzi O. Milligan, University of Bath

Textbooks are a crucial part of any child’s learning. A large body of research has proved this many times and in many very different contexts. Textbooks are a physical representation of the curriculum in a classroom setting. They are powerful in shaping the minds of children and young people.

UNESCO has recognised this power and called for every child to have a textbook for every subject. The organisation argues that

next to an engaged and prepared teacher, well-designed textbooks in sufficient quantities are the most effective way to improve instruction and learning.

But there’s an elephant in the room when it comes to textbooks in African countries’ classrooms: language.

Rwanda is one of many African countries that’s adopted a language instruction policy which sees children learning in local or mother tongue languages for the first three years of primary school. They then transition in upper primary and secondary school into a dominant, so-called “international” language. This might be French or Portuguese. In Rwanda, it has been English since 2008.

Evidence from across the continent suggests that at this transition point, many learners have not developed basic literacy and numeracy skills. And, significantly, they have not acquired anywhere near enough of the language they are about to learn in to be able to engage in learning effectively.

I do not wish to advocate for English medium instruction, and the arguments for mother-tongue based education are compelling. But it’s important to consider strategies for supporting learners within existing policy priorities. Using appropriate learning and teaching materials – such as textbooks – could be one such strategy.

A different approach

It’s not enough to just hand out textbooks in every classroom. The books need to tick two boxes: learners must be able to read them and teachers must feel enabled to teach with them.

Existing textbooks tend not to take these concerns into consideration. The language is too difficult and the sentence structures too complex. The paragraphs are too long, and there are no glossaries to define unfamiliar words. And while textbooks are widely available to those in the basic education system, they are rarely used systematically. Teachers cite the books’ inaccessibility as one of the main reasons for not using them.

A recent initiative in Rwanda has sought to address this through the development of “language supportive” textbooks for primary 4 learners who are around 11 years old. These were specifically designed in collaboration with local publishers, editors and writers.

Language supportive textbooks have been shown to make a difference in some Rwandan classrooms.

There are two key elements to a “language supportive” textbook.

Firstly, they are written at a language level which is appropriate for the learner. As can be seen in Figure 1, the new concept is introduced in English that is as simple as possible. The sentence structure and paragraph length are also shortened and made as simple as possible. The key word (here, “soil”) is also repeated numerous times so that the learner becomes accustomed to this word.

Figure 1. Source: University of Bristol and the British Council

Secondly, they include features – activities, visuals, clear signposting and vocabulary support – that enable learners to practice and develop their language proficiency while learning the key elements of the curriculum.

The books are full of relevant activities that encourage learners to regularly practice their listening, speaking, reading and writing of English in every lesson. This enables language development.

Crucially, all of these activities are made accessible to learners – and teachers – by offering support in the learners’ first language. In this case, the language used was Kinyarwanda, which is the first language for the vast majority of Rwandan people. However, it’s important to note that initially many teachers were hesitant about incorporating Kinyarwanda into their classroom practice because of the government’s English-only policy.

Improved test scores

The initiative was introduced with 1075 students at eight schools across four Rwandan districts. The evidence from our initiative suggests that learners in classrooms where these books were systematically used learnt more across the curriculum.

When these learners sat tests before using the books, they scored similar results to those in other comparable schools. After using the materials for four months, their test scores were significantly higher. Crucially, both learners and teachers pointed out how important it was that the books sanctioned the use of Kinyarwanda. The classrooms became bilingual spaces and this increased teachers’ and learners’ confidence and competence.

All of this supports the importance of textbooks as effective learning and teaching materials in the classroom and shows that they can help all learners. But authorities mustn’t assume that textbooks are being used or that the existing books are empowering teachers and learners.

Textbooks can matter – but it’s only when consideration is made for the ways they can help all learners that we can say that they can contribute to quality education for all.

Lizzi O. Milligan, Lecturer in International Education, University of Bath

This article was originally published on The Conversation. Read the original article.

Younger is not always better when it comes to learning a second language


Learning a language in a classroom is best for early teenagers.
from http://www.shutterstock.com

Warren Midgley, University of Southern Queensland

It’s often thought that it is better to start learning a second language at a young age. But research shows that this is not necessarily true. In fact, the best age to start learning a second language can vary significantly, depending on how the language is being learned.

The belief that younger children are better language learners is based on the observation that children learn to speak their first language with remarkable skill at a very early age.

Before they can add two small numbers or tie their own shoelaces, most children develop a fluency in their first language that is the envy of adult language learners.

Why younger may not always be better

Two theories from the 1960s continue to have a significant influence on how we explain this phenomenon.

The theory of “universal grammar” proposes that children are born with an instinctive knowledge of the language rules common to all humans. Upon exposure to a specific language, such as English or Arabic, children simply fill in the details around those rules, making the process of learning a language fast and effective.

The other theory, known as the “critical period hypothesis”, posits that at around the age of puberty most of us lose access to the mechanism that made us such effective language learners as children. These theories have been contested, but nevertheless they continue to be influential.

Despite what these theories would suggest, however, research into language learning outcomes demonstrates that younger may not always be better.

In some language learning and teaching contexts, older learners can be more successful than younger children. It all depends on how the language is being learned.

Language immersion environment best for young children

Living, learning and playing in a second language environment on a regular basis is an ideal learning context for young children. Research clearly shows that young children are able to become fluent in more than one language at the same time, provided there is sufficient engagement with rich input in each language. In this context, it is better to start as young as possible.

Learning in classroom best for early teens

Learning in language classes at school is an entirely different context. The normal pattern of these classes is to have one or more hourly lessons per week.

To succeed at learning with such little exposure to rich language input requires meta-cognitive skills that do not usually develop until early adolescence.

For this style of language learning, the later years of primary school are an ideal time to start, to maximise the balance between meta-cognitive skill development and the number of consecutive years of study available before the end of school.

Self-guided learning best for adults

There are, of course, some adults who decide to start to learn a second language on their own. They may buy a study book, sign up for an online course, purchase an app or join face-to-face or virtual conversation classes.

To succeed in this learning context requires a range of skills that are not usually developed until reaching adulthood, including the ability to remain self-motivated. Therefore, self-directed second language learning is more likely to be effective for adults than younger learners.

How we can apply this to education

What does this tell us about when we should start teaching second languages to children? In terms of the development of language proficiency, the message is fairly clear.

If we are able to provide lots of exposure to rich language use, early childhood is better. If the only opportunity for second language learning is through more traditional language classes, then late primary school is likely to be just as good as early childhood.

However, if language learning relies on being self-directed, it is more likely to be successful after the learner has reached adulthood.

Warren Midgley, Associate Professor of Applied Linguistics, University of Southern Queensland

This article was originally published on The Conversation. Read the original article.

The shelf-life of slang – what will happen to those ‘democracy sausages’?



Kate Burridge, Monash University

Every year around this time, dictionaries across the English-speaking world announce their “Word of the Year”. These are expressions (some newly minted and some golden oldies too) that for some reason have shot into prominence during the year.

Earlier this month The Australian National Dictionary Centre declared its winner “democracy sausage” – the barbecued snag that on election day makes compulsory voting so much easier to swallow.

Dictionaries make their selections in different ways, but usually it involves a combination of suggestions from the public and the editorial team (who have been meticulously tracking these words throughout the year). The Macquarie Dictionary has two selections – the Committee’s Choice made by the Word of the Year Committee, and the People’s Choice made by the public (so make sure you have your say on January 24 for the People’s Choice winner 2016).

It’s probably not surprising that these words of note draw overwhelmingly from slang language, or “slanguage” – a fall-out of the increasing colloquialisation of English usage worldwide. In Australia this love affair with the vernacular goes back to the earliest settlements of English speakers.

And now there’s the internet, especially social networking – a particularly fertile breeding ground for slang.

People enjoy playing with language, and when communicating electronically they have free rein. “Twitterholic”, “twaddiction”, “celebritweet/twit”, “twitterati” are just some of the “tweologisms” that Twitter has spawned of late. And with a reported average of 500 million tweets each day, Twitter has considerable capacity not only to create new expressions, but to spread them (as do Facebook, Instagram and other social networking platforms).

But what happens when slang terms like these make it into the dictionary? Early dictionaries give us a clue, particularly the entries that are stamped unfit for general use. Branded entries were certainly plentiful in Samuel Johnson’s 18th-century work, and many are now wholly respectable: abominably “a word of low or familiar language”, nowadays “barbarous usage”, fun “a low cant word” (what would Johnson have thought of very fun and funner?).

Since the point of slang is to mark an in-group, to amuse and perhaps even to shock outsiders with novelty, most slang expressions are short-lived. Those that survive become part of the mainstream and mundane. Quite simply, time drains them of their vibrancy and energy. J.M. Wattie put it more poetically back in 1930:

Slang terms are the mayflies of language; by the time they get themselves recorded in a dictionary, they are already museum specimens.

But, then again, expressions occasionally do sneak through the net. Not only do they survive, they stay slangy – and sometimes over centuries. Judge for yourselves. Here are some entries from A New and Comprehensive Vocabulary of the Flash Language. Written by British convict James Hardy Vaux in 1812, this is the first dictionary compiled in Australia.

croak “to die”

grub “food”

kid “deceive”

mug “face”

nuts on “to have a strong inclination towards something or someone”

on the sly “secretly”

racket “particular kind of fraud”

snitch “to betray”

stink “an uproar”

spin a yarn “tell a tale of great adventure”

These were originally terms of flash – or, as Vaux put it, “the cant language used by the family”. In other words, they belonged to underworld slang. The term slang itself meant something similar at this time; it broadened to highly colloquial language in the 1800s.

Vaux went on to point out that “to speak good flash is to be well versed in cant terms” — and, having been transported to New South Wales on three separate occasions during his “checkered and eventful life” (his words), Vaux himself was clearly well versed in the world of villainy and cant.

True, the majority of the slang terms here have dropped by the wayside (barnacles “spectacles”; lush “to drink”), and the handful that survive are now quite standard (grab “to seize”; dollop “large quantity”). But a few have not only lasted, they’ve remained remarkably contemporary-sounding – some still even a little “disgraceful” (as Vaux described them).

The shelf-life of slang is a bit of a mystery. Certainly some areas fray faster than others. Vaux’s prime, plummy and rum (meaning “excellent”) have well and truly bitten the dust. Cool might have made a comeback (also from the 1800s), but intensifiers generally wear out.

Far out and ace have been replaced by awesome, and there are plenty of new “awesome” words lurking in the wings. Some of these are already appearing on lists for “Most Irritating Word of the Year” – it’s almost as if their success does them in. Amazeballs, awesomesauce and phat are among the walking dead.

But as long as sausage sizzles continue to support Australian voters on election day, democracy sausages will have a place – and if adopted elsewhere, might even entice the politically uninterested into polling booths.


Kate Burridge, Professor of Linguistics, Monash University

This article was originally published on The Conversation. Read the original article.

Things you were taught at school that are wrong



Misty Adoniou, University of Canberra

Do you remember being taught you should never start your sentences with “And” or “But”?

What if I told you that your teachers were wrong and there are lots of other so-called grammar rules that we’ve probably been getting wrong in our English classrooms for years?

How did grammar rules come about?

To understand why we’ve been getting it wrong, we need to know a little about the history of grammar teaching.

Grammar is how we organise our sentences in order to communicate meaning to others.

Those who say there is one correct way to organise a sentence are called prescriptivists. Prescriptivist grammarians prescribe how sentences must be structured.

Prescriptivists had their day in the sun in the 18th century. As books became more accessible to the everyday person, prescriptivists wrote the first grammar books to tell everyone how they must write.

These self-appointed guardians of the language just made up grammar rules for English, and put them in books that they sold. It was a way of ensuring that literacy stayed out of reach of the working classes.

They took their newly concocted rules from Latin. This was, presumably, to keep literate English out of reach of anyone who wasn’t rich or posh enough to attend a grammar school, which was a school where you were taught Latin.

And yes, that is the origin of today’s grammar schools.

The other camp of grammarians are the descriptivists. They write grammar guides that describe how English is used by different people, and for different purposes. They recognise that language isn’t static, and it isn’t one-size-fits-all.

1. You can’t start a sentence with a conjunction

Let’s start with the grammatical sin I have already committed in this article. You can’t start a sentence with a conjunction.

Obviously you can, because I did. And I expect I will do it again before the end of this article. There, I knew I would!

Those who say it is always incorrect to start a sentence with a conjunction, like “and” or “but”, sit in the prescriptivist camp.

However, according to the descriptivists, at this point in our linguistic history, it is fine to start a sentence with a conjunction in an op-ed article like this, or in a novel or a poem.

It is less acceptable to start a sentence with a conjunction in an academic journal article, or in an essay for my son’s high school economics teacher, as it turns out. But times are changing.

2. You can’t end a sentence with a preposition

Well, in Latin you can’t. In English you can, and we do all the time.

Admittedly a lot of the younger generation don’t even know what a preposition is, so this rule is already obsolete. But let’s have a look at it anyway, for old time’s sake.

According to this rule, it is wrong to say “Who did you go to the movies with?”

Instead, the prescriptivists would have me say “With whom did you go to the movies?”

I’m saving that structure for when I’m making polite chat with the Queen on my next visit to the palace.

That’s not a sarcastic comment, just a fanciful one. I’m glad I know how to structure my sentences for different audiences. It is a powerful tool. It means I usually feel comfortable in whatever social circumstances I find myself in, and I can change my writing style according to purpose and audience.

That is why we should teach grammar in schools. We need to give our children a full repertoire of language so that they can make grammatical choices that will allow them to speak and write for a wide range of audiences.

3. Put a comma when you need to take a breath

It’s a novel idea, synchronising your writing with your breathing, but the two have nothing to do with one another. And if this is the instruction we give our children, it is little wonder commas are so poorly used.

Punctuation is a minefield and I don’t want to risk blowing up the internet. So here is a basic description of what commas do.

Commas provide demarcation between like grammatical structures. When adjectives, nouns, phrases or clauses are butting up against each other in a sentence, we separate them with a comma. That’s why I put commas between the three nouns and the two clauses in that last sentence.

Commas also provide demarcation for words, phrases or clauses that are embedded in a sentence for effect. The sentence would still be a sentence even if we took those words away. See, for example, the use of commas in this sentence.

4. To make your writing more descriptive, use more adjectives

American writer Mark Twain had it right.

“When you catch an adjective, kill it. No, I don’t mean utterly, but kill most of them – then the rest will be valuable.”

If you want your writing to be more descriptive, play with your sentence structure.

Consider this sentence from Liz Lofthouse’s beautiful children’s book Ziba Came on a Boat. It comes at a key turning point in the book, the story of a refugee’s escape.

“Clutching her mother’s hand, Ziba ran on and on, through the night, far away from the madness until there was only darkness and quiet.”

A beautifully descriptive sentence, and not an adjective in sight.

5. Adverbs are the words that end in ‘ly’

Lots of adverbs end in “ly”, but lots don’t.

Adverbs give more information about verbs. They tell us when, where, how and why the verb happened. So that means words like “tomorrow”, “there” and “deep” can be adverbs.

I say they can be adverbs because, actually, a word is just a word. It becomes an adverb, or a noun, or an adjective, or a verb when it is doing that job in a sentence.

Deep into the night, and the word deep is an adverb. Down a deep, dark hole and it is an adjective. When I dive into the deep, it is doing the work of a noun.

Time to take those word lists of adjectives, verbs and nouns off the classroom walls.

Time, also, to ditch those old Englishmen who wrote a grammar for their times, not ours.

If you want to understand what our language can do and how to use it well, read widely, think deeply and listen carefully. And remember, neither time nor language stands still – for any of us.


Misty Adoniou, Associate Professor in Language, Literacy and TESL, University of Canberra

This article was originally published on The Conversation. Read the original article.