British literature is richly tangled with other histories and cultures – so why is it sold as largely white and English?


Brick Lane: popularised in a novel by British writer Monica Ali.
Shutterstock

Elleke Boehmer, University of Oxford and Erica Lombard, University of Oxford

Recent global developments have sharply polarised communities in many countries around the world. A new politics of exclusion has drawn urgent attention to the ways in which structural inequality has marginalised and silenced certain sectors of society. And yet, as a recent report shows, diversity and inclusion in fact “benefit the common good”. A more diverse group is a stronger, more creative and productive group.

In the world of literary writing, we find similar gaps and exclusions. But these are counterbalanced in some respects by new positive initiatives.

In 2015, a study revealed that literature by writers of colour had been consistently under-represented by the predominantly white British book industry. Statistics in The Bookseller show that out of thousands of books published in 2016 in the UK, fewer than 100 were by British authors of a non-white background. And out of 400 authors identified by the British public in a 2017 Royal Society of Literature survey, only 7% were black, Asian or of mixed race (compared to 13% of the population).

Colourful misrepresentation

A similar marginalisation takes place in the curricula in schools and universities, mirroring exclusions in wider society. In most English literature courses of whatever period, the writers taught are white, largely English and largely male.

A fundamental inequality arises in which, though British culture at large is diverse, syllabuses are not. Indeed, many British readers and students find little to recognise or to identify with when they read and study mainstream British literature.

But it’s not just a case of under-representation. It’s also a case of misrepresentation.

Black and Asian writers who have been published within the mainstream British system describe the pressure they have felt to conform to cultural stereotypes in their work. Their books are often packaged and presented in ways that focus on their ethnicity, regularly using cliches. At the same time, more universal aspects of their writing are overlooked. For example, the covers of novels by Asian British writers usually stick to a limited colour palette of yellows, reds, and purples, accented by “exotic” images.


These writers bristle at the sense that they are read not as crafters of words and worlds, but as spokespeople for their communities or cultures. At its worst, this process turns these writers and their books into objects of anthropological curiosity rather than works inviting serious literary study or simply pleasurable reading. The message is that black and Asian literature is other than or outside mainstream British writing.

Against these exclusions, leading British authors such as Bernardine Evaristo have called for a broader, more inclusive approach. They recognise that what and how we read shapes our sense of ourselves, our communities and the world.

Reframing the narrative

The Postcolonial Writers Make Worlds research project, based in the Oxford English Faculty and The Oxford Research Centre in the Humanities, set out to ask what it means to read contemporary fiction as British readers. Working with reading groups and in discussion with writers, we found that readers of all ages entered the relatively unfamiliar worlds created by BAME authors with interest.

For many, finding points of familiarity along gender, age, geographical or other lines was important for their ability to enjoy stories from communities different from their own. Identifying in this way gave some readers new perspectives on their own contexts. At the same time, unfamiliarity was not a barrier to identification. In some cases, universal human stories, like falling in love, acted as a bridge. This suggests that how literature is presented to readers, whether it is framed as other or not, can be as significant as what is represented.

Contemporary black and Asian writing from the UK is British writing. And this means that the work of writers such as Evaristo, Nadifa Mohamed and Daljit Nagra should be placed on the same library shelf, reading list and section of the bookshop as work by Ian McEwan, Julian Barnes and Ali Smith – not exclusively in “world interest” or “global literature” sections.

Bookish.
Shutterstock

Equally, much can be gained by thinking of white British writers like Alan Hollinghurst or Hilary Mantel as having as much of a cross-cultural or even postcolonial outlook as Aminatta Forna and Kamila Shamsie.

There are positive signs. A new Edexcel/Pearson A-level teaching resource on Contemporary Black British Literature has been developed. The Why is My Curriculum White? campaign continues to make inroads in university syllabuses. And the Jhalak Prize is raising the profile of BAME writing in Britain. Against this background, the Postcolonial Writers Make Worlds website offers a multimedia hub of resources on black and Asian British writing, providing points of departure for more inclusive, wide-ranging courses. Yet there is still much to be done.

All literature written in English in the British Isles is densely entangled with other histories, cultures and pathways of experience, both within the country and far beyond. Our syllabuses, our publishing practices and our conversations about books must reflect this.

Elleke Boehmer, Professor of World Literature in English, University of Oxford and Erica Lombard, Postdoctoral Research Fellow, University of Oxford

This article was originally published on The Conversation. Read the original article.


Is there such a thing as a national sense of humour?


A statue celebrating Monty Python’s sketch The Dead Parrot near London’s Tower Bridge ahead of a live show on the TV channel Gold.
DAVID HOLT/Flickr, CC BY-SA

Gary McKeown, Queen’s University Belfast

We’re all aware that there are stereotypes. The British are sharply sarcastic, the Americans are great at physical comedy, and the Japanese love puns. But is humour actually driven by culture to any meaningful extent? Couldn’t it be more universal – or depend largely on the individual?

There are some good reasons to believe that there is such a thing as a national sense of humour. But let’s start with what we actually have in common, by looking at the kinds of humour that most easily transcend borders.

Certain kinds of humour are more commonly used in circumstances that are international and multicultural in nature – such as airports. When it comes to onboard entertainment, airlines in particular are fond of humour that transcends cultural and linguistic boundaries, for obvious reasons. Slapstick humour and the bland but almost universally tolerable social transgressions and faux pas of Mr Bean permit a safe, gentle humour that we can all relate to. Likewise, the silent situational dilemmas of the Canadian hidden-camera reality show Just for Laughs have been a staple option for airlines for many years.

Just for laughs.

These have a broad reach and are probably unlikely to offend most people. Of course, an important component in their broad appeal is that they are not really based on language.

Language and culture

Most humour, and certainly humour that involves greater cognitive effort, is deeply embedded in language and culture. It relies on a shared language or set of culturally based constructs to function. Puns and idioms are obvious examples.

Indeed, most modern theories of humour suggest that some form of shared knowledge is one of the key foundations of humour – that is, after all, what a culture is.

Some research has demonstrated this. One study measured humour in Singaporean college students and compared it with that of North American and Israeli students, using a questionnaire that asked participants, among other things, to describe jokes they found funny. The researchers found that the Americans were more likely to tell sex jokes than the Singaporeans, whose jokes were slightly more often focused on violence. They interpreted the lack of sex jokes among Singaporean students as a reflection of a more conservative society, and the aggressive jokes as a reflection of a cultural emphasis on strength for survival.

International humour?
C.P.Storm/Flickr, CC BY-SA

Another study compared Japanese and Taiwanese students’ appreciation of English jokes. It found that the Taiwanese generally enjoyed jokes more than the Japanese and were also more eager to understand incomprehensible jokes. The authors argued that this could be down to a more hierarchical culture in Japan, leaving less room for humour.

Denigration and self-deprecation

There are many overarching themes that can be used to define a nation’s humour. A nation that laughs together shows a strong allegiance among its citizens. Laughter is one of our main social signals, and combined with humour it can emphasise social bonding – albeit sometimes at the cost of denigrating other groups. This can be seen across many countries. For example, the French tend to enjoy a joke about the Belgians, while Swedes make fun of Norwegians. Indeed, most nations have a preferred country that serves as the traditional butt of their jokes.

Sexist and racist humour are also examples of this sort of denigration. The types of jokes used can vary across cultures, but the phenomenon itself can boost social bonding. Knowledge of acceptable social boundaries is therefore crucial and reinforces social cohesion. As denigration is usually not the principal aim of the interaction, this helps explain why people often fail to realise that they are being offensive when they are “only joking”. However, as the world becomes more global and tolerant of difference, this type of humour is becoming much less acceptable in cultures that welcome diversity.

Self-denigration or self-deprecation is also important – if it is relatively mild and remains within acceptable social norms. Benign violation theory argues that humour arises when something threatens social or cultural norms while remaining benign enough to cause no real harm.

Importantly, what constitutes a benign level of harm is strongly culturally bound and differs from nation to nation, between social groups within nations and over the course of a nation’s history. What was once tolerable as national humour can now seem very unacceptable. For the British, it may be acceptable to make fun of Britons being overly polite, orderly or reluctant to talk to strangers. However, jokes about the nature of Britain’s colonial past would be much more contentious – they would probably violate social norms without being emotionally benign.

Another factor is our need to demonstrate that we understand the person we are joking with. My own ideas suggest we even have a desire to display our skill at knowing what another person thinks – mind-reading in the scientific sense. Cultural alignment, and the ability to display it, are therefore key elements in producing and appreciating humour – and they can lead us to joke differently with people from our own country than with people from other cultures.

‘Fork handles’.

For example, most people in the UK know that the popular phrase “don’t mention the war” refers to a Fawlty Towers sketch. Knowing that “fork handles” is funny also marks you as a UK citizen (see video above). Similarly, knowledge of “I Love Lucy” or quotes from Seinfeld creates affiliation among many in the US, while references to “Chavo del Ocho” or “Chapulín Colorado” do the same for Mexicans and most Latin Americans.

These shared cultural motifs – here drawn mostly from television – are one important aspect of a national sense of humour. They create a sense of belonging and camaraderie. They make us feel more confident about our humour and serve as foundations on which further jokes can be built.

A broadly shared sense of humour is probably one of our best indicators for how assimilated we are as a nation. Indeed, a nation’s humour is more likely to show unity within a country than to display a nation as being different from other nations in any meaningful way.

Gary McKeown, Senior Lecturer of Psychology, Queen’s University Belfast

This article was originally published on The Conversation. Read the original article.

Younger is not always better when it comes to learning a second language


Learning a language in a classroom is best for early teenagers.
from http://www.shutterstock.com

Warren Midgley, University of Southern Queensland

It’s often thought that it is better to start learning a second language at a young age. But research shows that this is not necessarily true. In fact, the best age to start learning a second language can vary significantly, depending on how the language is being learned.

The belief that younger children are better language learners is based on the observation that children learn to speak their first language with remarkable skill at a very early age.

Before they can add two small numbers or tie their own shoelaces, most children develop a fluency in their first language that is the envy of adult language learners.

Why younger may not always be better

Two theories from the 1960s continue to have a significant influence on how we explain this phenomenon.

The theory of “universal grammar” proposes that children are born with an instinctive knowledge of the language rules common to all humans. Upon exposure to a specific language, such as English or Arabic, children simply fill in the details around those rules, making the process of learning a language fast and effective.

The other theory, known as the “critical period hypothesis”, posits that at around the age of puberty most of us lose access to the mechanism that made us such effective language learners as children. These theories have been contested, but nevertheless they continue to be influential.

Despite what these theories would suggest, however, research into language learning outcomes demonstrates that younger may not always be better.

In some language learning and teaching contexts, older learners can be more successful than younger children. It all depends on how the language is being learned.

Language immersion environment best for young children

Living, learning and playing in a second language environment on a regular basis is an ideal learning context for young children. Research clearly shows that young children are able to become fluent in more than one language at the same time, provided there is sufficient engagement with rich input in each language. In this context, it is better to start as young as possible.

Learning in classroom best for early teens

Learning in language classes at school is an entirely different context. The normal pattern of these classes is one or more hour-long lessons per week.

To succeed with so little exposure to rich language input requires meta-cognitive skills that do not usually develop until early adolescence.

For this style of language learning, the later years of primary school are an ideal time to start, maximising the balance between meta-cognitive skill development and the number of consecutive years of study available before the end of school.

Self-guided learning best for adults

There are, of course, some adults who decide to start to learn a second language on their own. They may buy a study book, sign up for an online course, purchase an app or join face-to-face or virtual conversation classes.

To succeed in this learning context requires a range of skills that are not usually developed until reaching adulthood, including the ability to remain self-motivated. Therefore, self-directed second language learning is more likely to be effective for adults than younger learners.

How we can apply this to education

What does this tell us about when we should start teaching second languages to children? In terms of the development of language proficiency, the message is fairly clear.

If we are able to provide lots of exposure to rich language use, early childhood is better. If the only opportunity for second language learning is through more traditional language classes, then late primary school is likely to be just as good as early childhood.

However, if language learning relies on being self-directed, it is more likely to be successful after the learner has reached adulthood.

Warren Midgley, Associate Professor of Applied Linguistics, University of Southern Queensland

This article was originally published on The Conversation. Read the original article.

The world’s words of the year pass judgement on a dark, surreal 2016



Philip Seargeant, The Open University

Every December, lexicographers around the world choose their “words of the year”, and this year, perhaps more than ever, the stories these tell provide a fascinating insight into how we’ve experienced the drama and trauma of the last 12 months.

There was much potential in 2016. It was 500 years ago that Thomas More wrote his Utopia, and January saw the launch of a year’s celebrations under the slogan “A Year of Imagination and Possibility” – but as 2017 looms, this slogan rings hollow. Instead of utopian dreams, we’ve had a year of “post-truth” and “paranoia”, of “refugee” crises, “xenophobia” and a close shave with “fascism”.

Earlier in the year, a campaign was launched to have “Essex Girl” removed from the Oxford English Dictionary (OED). Those behind the campaign were upset at the derogatory definition – a young woman “characterised as unintelligent, promiscuous, and materialistic” – so wanted it to be expunged from the official record of the language.

The OED turned down the request, a spokeswoman explaining that since the OED is a historical dictionary, nothing is ever removed; its purpose, she said, is to describe the language as people use it, and to stand as a catalogue of the trends and preoccupations of the time.

The words of the year tradition began with the German Wort des Jahres in the 1970s. It has since spread to other languages, and become increasingly popular the world over. Those in charge of the choices are getting more innovative: in 2015, for the first time, Oxford Dictionaries chose a pictograph as their “word”: the emoji for “Face with Tears of Joy”.

In 2016, however, the verbal was very much back in fashion. The results speak volumes.

Dark days

In English, there is a range of competing words, with all the major dictionaries making their own choices. Having heralded a post-language era last year, Oxford Dictionaries decided on “post-truth” this time, defining it as the situation when “objective facts are less influential in shaping public opinion than appeals to emotion and personal belief”. In a year of evidence-light Brexit promises and Donald Trump’s persistent lies and obfuscations, this has a definite resonance. In the same dystopian vein, the Cambridge Dictionary chose “paranoid”, while Dictionary.com went for “xenophobia”.

Merriam-Webster valiantly tried to turn back the tide of pessimism. When “fascism” looked set to win its online poll, it tweeted at its readers, imploring them to get behind something – anything – else. The plea apparently worked, and in the end “surreal” won the day – apt enough for a year in which events time and again almost defied belief.

The referendum that spawned a thousand words.
EPA/Andy Rain

Collins, meanwhile, chose “Brexit”, a term which its spokesperson suggested has become as flexible and influential in political discourse as “Watergate”.

Just as the latter spawned hundreds of portmanteau words whenever a political scandal broke, so Brexit begat “Bremain”, “Bremorse” and “Brexperts” – and will likely be adapted for other upcoming political rifts for many years to come. In fact, it nearly won out in Australia, where “Ausexit” (severing ties with the British monarchy or the United Nations) was on the shortlist. Instead, the Australian National Dictionary went for “democracy sausage” – the tradition of eating a barbecued sausage on election day.

Around the world, a similar pattern of politics and apprehension emerges. In France, the mot de l’année was réfugiés (refugees); and in Germany postfaktisch, meaning much the same as “post-truth”. Swiss German speakers, meanwhile, went for Filterblase (filter bubble), the idea that social media is creating increasingly polarised political communities.

Switzerland’s Deaf Association, meanwhile, chose a Sign of the Year for the first time. Its choice was “Trump”, consisting of a gesture made by placing an open palm on the top of the head, mimicking the president-elect’s extravagant hairstyle.

2016’s golden boy, as far as Japan’s concerned.
Albert H. Teich

Trump’s hair also featured in Japan’s choice for this year. Rather than a word, Japan chooses a kanji (Chinese character); 2016’s choice was “金” (gold). It represented a number of topical issues: Japan’s haul of medals at the Rio Olympics, fluctuating interest rates, the gold shirt worn by singer and YouTube sensation Piko Taro and, inevitably, the colour of Trump’s hair.

And then there’s Austria, whose word is 51 letters long: Bundespräsidentenstichwahlwiederholungsverschiebung. It means “the repeated postponement of the runoff vote for Federal President”. Referring to the seven months of votes, legal challenges and delays over the country’s presidential election, this again references an event that flirted with extreme nationalism and exposed the convoluted nature of democracy. As a new coinage, it also illustrates language’s endless ability to creatively grapple with unfolding events.

Which brings us, finally, to “unpresidented”, a neologism Donald Trump inadvertently created when trying to spell “unprecedented” in a tweet attacking the Chinese. At the moment, it’s a word in search of a meaning, but the possibilities it suggests seem to speak perfectly to the history of the present moment. And depending on what competitors 2017 throws up, it could well emerge as a future candidate.


Philip Seargeant, Senior Lecturer in Applied Linguistics, The Open University

This article was originally published on The Conversation. Read the original article.

The shelf-life of slang – what will happen to those ‘democracy sausages’?



Kate Burridge, Monash University

Every year around this time, dictionaries across the English-speaking world announce their “Word of the Year”. These are expressions (some newly minted and some golden oldies too) that for some reason have shot into prominence during the year.

Earlier this month The Australian National Dictionary Centre declared its winner “democracy sausage” – the barbecued snag that on election day makes compulsory voting so much easier to swallow.

Dictionaries make their selections in different ways, but usually it involves a combination of suggestions from the public and the editorial team (who have been meticulously tracking these words throughout the year). The Macquarie Dictionary has two selections – the Committee’s Choice made by the Word of the Year Committee, and the People’s Choice made by the public (so make sure you have your say on January 24 for the People’s Choice winner 2016).

It’s probably not surprising that these words of note draw overwhelmingly from slang, or “slanguage” – a fallout of the increasing colloquialisation of English usage worldwide. In Australia, this love affair with the vernacular goes back to the earliest settlements of English speakers.

And now there’s the internet, especially social networking – a particularly fertile breeding ground for slang.

People enjoy playing with language, and when communicating electronically they have free rein. “Twitterholic”, “twaddiction”, “celebritweet/twit”, “twitterati” are just some of the “tweologisms” that Twitter has spawned of late. And with a reported average of 500 million tweets each day, Twitter has considerable capacity not only to create new expressions, but to spread them (as do Facebook, Instagram and other social networking platforms).

But what happens when slang terms like these make it into the dictionary? Early dictionaries give us a clue, particularly the entries that are stamped unfit for general use. Branded entries were certainly plentiful in Samuel Johnson’s 18th-century work, and many are now wholly respectable: abominably “a word of low or familiar language”, nowadays “barbarous usage”, fun “a low cant word” (what would Johnson have thought of very fun and funner?).

Since the point of slang is to mark an in-group, to amuse and perhaps even to shock outsiders with novelty, most slang expressions are short-lived. Those that survive become part of the mainstream and mundane. Quite simply, time drains them of their vibrancy and energy. J.M. Wattie put it more poetically back in 1930:

Slang terms are the mayflies of language; by the time they get themselves recorded in a dictionary, they are already museum specimens.

But, then again, expressions occasionally do sneak through the net. Not only do they survive, they stay slangy – and sometimes over centuries. Judge for yourselves. Here are some entries from A New and Comprehensive Vocabulary of the Flash Language. Written by British convict James Hardy Vaux in 1812, this is the first dictionary compiled in Australia.

croak “to die”

grub “food”

kid “deceive”

mug “face”

nuts on “to have a strong inclination towards something or someone”

on the sly “secretly”

racket “particular kind of fraud”

snitch “to betray”

stink “an uproar”

spin a yarn “tell a tale of great adventure”

These were originally terms of flash – or, as Vaux put it, “the cant language used by the family”. In other words, they belonged to underworld slang. The term slang itself meant something similar at this time; it broadened to highly colloquial language in the 1800s.

Vaux went on to point out that “to speak good flash is to be well versed in cant terms” — and, having been transported to New South Wales on three separate occasions during his “checkered and eventful life” (his words), Vaux himself was clearly well versed in the world of villainy and cant.

True, the majority of the slang terms here have dropped by the wayside (barnacles “spectacles”; lush “to drink”), and the handful that survive are now quite standard (grab “to seize”; dollop “large quantity”). But a few have not only lasted, they’ve remained remarkably contemporary-sounding – some still even a little “disgraceful” (as Vaux described them).

The shelf-life of slang is a bit of a mystery. Certainly some areas fray faster than others. Vaux’s prime, plummy and rum (meaning “excellent”) have well and truly bitten the dust. Cool might have made a comeback (also from the 1800s), but intensifiers generally wear out.

Far out and ace have been replaced by awesome, and there are plenty of new “awesome” words lurking in the wings. Some of these are already appearing on lists for “Most Irritating Word of the Year” – it’s almost as if their success does them in. Amazeballs, awesomesauce and phat are among the walking dead.

But as long as sausage sizzles continue to support Australian voters on election day, democracy sausages will have a place – and if adopted elsewhere, might even entice the politically uninterested into polling booths.


Kate Burridge, Professor of Linguistics, Monash University

This article was originally published on The Conversation. Read the original article.

20 misused English words that make smart people look silly by Dr. Travis Bradberry


We’re all tempted to use words that we’re not too familiar with. If this were the only problem, I wouldn’t have much to write about. That’s because we’re cautious with words we’re unsure of and, thus, they don’t create much of an issue for us. It’s the words that we think we’re using correctly that wreak the most havoc. We throw them around in meetings, e-mails and important documents (such as resumes and client reports), and they land, like fingernails across a chalkboard, on everyone who has to hear or read them. We’re all guilty of this from time to time, myself included.

When I write, I hire an editor who is an expert in grammar to review my articles before I post them online. It’s bad enough to have a roomful of people witness your blunder—it’s something else entirely to stumble in front of 100,000! The point is, we can all benefit from opportunities to sharpen the saw and minimize our mistakes. Often, it’s the words we perceive as being more correct or sophisticated that don’t really mean what we think they do. There are 20 such words that have a tendency to make even really smart people stumble.

Have a look to see which of these commonly confused words throw you off.

Accept vs. Except

These two words sound similar but have very different meanings. Accept means to receive something willingly: “His mom accepted his explanation” or “She accepted the gift graciously.” Except signifies exclusion: “I can attend every meeting except the one next week.” To help you remember, note that both except and exclusion begin with ex.

Affect vs. Effect

To make these words even more confusing than they already are, both can be used as either a noun or a verb. Let’s start with the verbs. Affect means to influence something or someone; effect means to accomplish something. “Your job was affected by the organizational restructuring” but “These changes will be effected on Monday.” As a noun, an effect is the result of something: “The sunny weather had a huge effect on sales.” Effect is almost always the right noun choice, because the noun affect refers to an emotional state and is rarely used outside of psychological circles: “The patient’s affect was flat.”

Lie vs. Lay

We’re all pretty clear on the lie that means an untruth. It’s the other usage that trips us up. Lie also means to recline: “Why don’t you lie down and rest?” Lay requires an object: “Lay the book on the table.” Lie is something you can do by yourself, but you need an object to lay. It’s more confusing in the past tense. The past tense of lie is—you guessed it—lay: “I lay down for an hour last night.” And the past tense of lay is laid: “I laid the book on the table.”

Bring vs. Take

Bring and take both describe transporting something or someone from one place to another, but the correct usage depends on the speaker’s point of view. Somebody brings something to you, but you take it to somewhere else: “Bring me the mail, then take your shoes to your room.” Just remember, if the movement is toward you, use bring; if the movement is away from you, use take.

Ironic vs. Coincidental

A lot of people get this wrong. If you break your leg the day before a ski trip, that’s not ironic—it’s coincidental (and bad luck). Ironic has several meanings, all of which include some type of reversal of what was expected. Verbal irony is when a person says one thing but clearly means another. Situational irony is when a result is the opposite of what was expected. O. Henry was a master of situational irony. In his famous short story The Gift of the Magi, Jim sells his watch to buy combs for his wife’s hair, and she sells her hair to buy a chain for Jim’s watch. Each character sold something precious to buy a gift for the other, but those gifts were intended for what the other person sold. That is true irony. If you break your leg the day before a ski trip, that’s coincidental. If you drive up to the mountains to ski, and there was more snow back at your house, that’s ironic.

Imply vs. Infer

To imply means to suggest something without saying it outright. To infer means to draw a conclusion from what someone else implies. As a general rule, the speaker/writer implies, and the listener/reader infers.

Nauseous vs. Nauseated

Nauseous has been misused so often that the incorrect usage is accepted in some circles. Still, it’s important to note the difference. Nauseous means causing nausea; nauseated means experiencing nausea. So, if your circle includes ultra-particular grammar sticklers, never say “I’m nauseous” unless you want them to be snickering behind your back.

Comprise vs. Compose

These are two of the most commonly misused words in the English language. Comprise means to include; compose means to make up. It all comes down to parts versus the whole. When you use comprise, you put the whole first: “A soccer game comprises (includes) two halves.” When you use compose, you put the pieces first: “Fifty states compose (make up) the United States of America.”

Farther vs. Further

Farther refers to physical distance, while further describes the degree or extent of an action or situation. “I can’t run any farther,” but “I have nothing further to say.” If you can substitute “more” or “additional,” use further.

Fewer vs. Less

Use fewer when you’re referring to separate items that can be counted; use less when referring to a whole: “You have fewer dollars, but less money.”

Bringing it all together

English grammar can be tricky, and a lot of the time the words that sound right are actually wrong. With words such as those listed above, you just have to memorize the rules so that when you are about to use them, you’ll catch yourself in the act and know for certain that you’ve written or said the right one.