British literature is richly tangled with other histories and cultures – so why is it sold as largely white and English?


Brick Lane: popularised in a novel by British writer Monica Ali.
Shutterstock

Elleke Boehmer, University of Oxford and Erica Lombard, University of Oxford

Recent global developments have sharply polarised communities in many countries around the world. A new politics of exclusion has drawn urgent attention to the ways in which structural inequality has marginalised and silenced certain sectors of society. And yet, as a recent report shows, diversity and inclusion in fact “benefit the common good”. A more diverse group is a stronger, more creative and productive group.

In the world of literary writing, we find similar gaps and exclusions. But these are counterbalanced in some respects by new positive initiatives.

In 2015, a study revealed that literature by writers of colour had been consistently under-represented by the predominantly white British book industry. Statistics in The Bookseller show that out of thousands of books published in 2016 in the UK, fewer than 100 were by British authors of a non-white background. And out of 400 authors identified by the British public in a 2017 Royal Society of Literature survey, only 7% were black, Asian or of mixed race (compared to 13% of the population).

Colourful misrepresentation

A similar marginalisation takes place in the curricula in schools and universities, mirroring exclusions in wider society. In most English literature courses of whatever period, the writers taught are white, largely English and largely male.

A fundamental inequality arises in which, though British culture at large is diverse, syllabuses are not. Indeed, many British readers and students find little to recognise or to identify with when they read and study mainstream British literature.

But it’s not just a case of under-representation. It’s also a case of misrepresentation.

Black and Asian writers who have been published within the mainstream British system describe the pressure they have felt to conform to cultural stereotypes in their work. Their books are often packaged and presented in ways that focus on their ethnicity, regularly using cliches. At the same time, more universal aspects of their writing are overlooked. For example, the covers of novels by Asian British writers usually stick to a limited colour palette of yellows, reds, and purples, accented by “exotic” images.

 

These writers bristle at the sense that they are read not as crafters of words and worlds, but as spokespeople for their communities or cultures. At its worst, this process turns these writers and their books into objects of anthropological curiosity rather than works inviting serious literary study or simply pleasurable reading. The message is that black and Asian literature is other than or outside mainstream British writing.

Against these exclusions, leading British authors such as Bernardine Evaristo have called for a broader, more inclusive approach. They recognise that what and how we read shapes our sense of ourselves, our communities and the world.

Reframing the narrative

The Postcolonial Writers Make Worlds research project, based in the Oxford English Faculty and The Oxford Research Centre in the Humanities, set out to ask what it means to read contemporary fiction as British readers. Working with reading groups and in discussion with writers, we found that readers of all ages entered the relatively unfamiliar worlds created by BAME authors with interest.

For many, finding points of familiarity along gender, age, geographical or other lines was important for their ability to enjoy stories from communities different from their own. Identifying in this way gave some readers new perspectives on their own contexts. At the same time, unfamiliarity was not a barrier to identification. In some cases, universal human stories, like falling in love, acted as a bridge. This suggests that how literature is presented to readers, whether it is framed as other or not, can be as significant as what is represented.

Contemporary black and Asian writing from the UK is British writing. And this means that the work of writers such as Evaristo, Nadifa Mohamed and Daljit Nagra should be placed on the same library shelf, reading list and section of the bookshop as work by Ian McEwan, Julian Barnes and Ali Smith – not exclusively under “world interest” or “global literature”.

Bookish.
Shutterstock

Equally, much can be gained by thinking of white British writers like Alan Hollinghurst or Hilary Mantel as having as much of a cross-cultural or even postcolonial outlook as Aminatta Forna and Kamila Shamsie.

There are positive signs. A new EdExcel/Pearson A-level teaching resource on Contemporary Black British Literature has been developed. The Why is My Curriculum White? campaign continues to make inroads in university syllabuses. And the Jhalak Prize is raising the profile of BAME writing in Britain. Against this background, the Postcolonial Writers Make Worlds website offers a multimedia hub of resources on black and Asian British writing, providing points of departure for more inclusive, wide-ranging courses. Yet there is still much to be done.

All literature written in English in the British Isles is densely entangled with other histories, cultures and pathways of experience, both within the country and far beyond. Our syllabuses, publishing practices and conversations about books must reflect this.

Elleke Boehmer, Professor of World Literature in English, University of Oxford and Erica Lombard, Postdoctoral Research Fellow, University of Oxford

This article was originally published on The Conversation. Read the original article.

 

Clear skies ahead: how improving the language of aviation could save lives



Dominique Estival, Western Sydney University

The most dangerous part of flying is driving to the airport.

That’s a standard joke among pilots, who know even better than the flying public that aviation is the safest mode of transportation.

But there are still those headlines and TV shows about airline crashes, and those statistics people like to repeat, such as:

Between 1976 and 2000, more than 1,100 passengers and crew lost their lives in accidents in which investigators determined that language had played a contributory role.

True enough, 80% of all air incidents and accidents occur because of human error. Miscommunication, combined with other human factors such as fatigue, cognitive workload, noise or forgetfulness, has played a role in some of the deadliest accidents.

The best known and most widely discussed is the 1977 collision on the ground between two Boeing 747s in Tenerife, which resulted in 583 fatalities. The accident was due in part to difficult communications between the pilot, whose native language was Dutch, and the Spanish air traffic controller.

In such a high-stakes environment as commercial aviation, where the lives of hundreds of passengers and innocent people on the ground are involved, communication is critical to safety.

So, it was decided that Aviation English would be the international language of aviation, and that all aviation professionals – pilots and air traffic controllers (ATC) – would need to be proficient in it. It is a highly structured and codified language, designed to minimise ambiguity and misunderstanding.

Pilots and ATC expect to hear certain bits of information in certain ways and in a given order. The “phraseology”, with its particular pronunciation (for example, “fife” and “niner” instead of “five” and “nine”, so they’re not confused with each other), specific words (“Cleared to land”), international alphabet (“Mike Hotel Foxtrot”) and strict conversation rules (you must repeat, or “read back”, an instruction), needs to be learned and practised.

In spite of globalisation and the spread of English, most people around the world are not native English speakers, and an increasing number of aviation professionals do not speak English as their first language.

Native speakers have an advantage when they learn Aviation English, since they already speak English at home and in their daily lives. But they encounter many pilots or ATC who learned English as a second or even third language.

Whose responsibility is it to ensure that communication is successful? Can native speakers simply speak the way they do at home and expect to be understood? Or do they also have the responsibility to make themselves understood and to learn how to understand pilots or ATC who are not native English speakers?

As a linguist, I analyse aviation language from a linguistics perspective. I have noted the restricted meaning of its few verbs and adjectives; that the only pronouns are “you” and sometimes “we” (“How do you read?”; “We’re overhead Camden”); how few questions there are, mostly imperatives (“Maintain heading 180”); and that the syntax is so simple (no complement clauses, no relative clauses, no recursion) that it might not even count as a human language for Chomsky.

But, as a pilot and a flight instructor, I look at it from the point of view of student pilots learning to use it in the cockpit while also learning to fly the airplane and navigate around the airfield.

How much harder is it to remember what to say when the workload goes up, and how much more difficult to speak over the radio when you know everyone else on the frequency is listening and will notice every little mistake you make?

Imagine, then, how much more difficult this is for pilots with English as a second language.

Camden Airport.
Supplied

Everyone learning another language knows it’s suddenly more challenging to hold a conversation over the phone than face-to-face, even with someone you already know. When it’s over the radio, with someone you don’t know, against the noise of the engine, static noise in the headphones, and while trying to make the plane do what you want it to do, it can be quite daunting.

No wonder student pilots who are not native English speakers sometimes prefer to stay silent – and so do even some experienced native English speakers when the workload is too great.

This is one of the results of my research conducted in collaboration with UNSW’s Brett Molesworth, combining linguistics and aviation human factors.

Experiments in a flight simulator with pilots of diverse language backgrounds and flying experience explored the conditions likely to result in pilots making mistakes or misunderstanding ATC instructions. Not surprisingly, increased workload, too much information and rapid ATC speech all caused mistakes.

Also not surprisingly, less experienced pilots, no matter their English proficiency, made more mistakes. But surprisingly, it was the level of training, rather than number of flying hours or language background, that predicted better communication.

Once we understand the factors contributing to miscommunication in aviation, we can propose solutions to prevent them. For example, technologies such as Automatic Speech Recognition and Natural Language Understanding may help catch errors in pilot readbacks that ATC did not notice and might complement training for pilots and ATC.
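As a concrete illustration of what such error-catching might involve, here is a minimal sketch of one step that could follow speech recognition: reduce the controller’s instruction and the pilot’s readback to canonical tokens, then flag salient items that were not read back. Everything in it – the word mappings, the keyword list, the function names – is invented for illustration and is not drawn from any real ATC system.

```python
import re

# Hypothetical normalisation table: map Aviation English spoken forms
# ("fife", "niner", "tree") and ordinary number words to canonical digits.
SPOKEN_TO_CANONICAL = {
    "zero": "0", "one": "1", "two": "2", "tree": "3", "three": "3",
    "four": "4", "fife": "5", "five": "5", "six": "6", "seven": "7",
    "eight": "8", "niner": "9", "nine": "9",
}

def normalise(utterance: str) -> list[str]:
    """Lower-case, split into words, and map spoken digits to canonical form."""
    words = re.findall(r"[a-z]+", utterance.lower())
    return [SPOKEN_TO_CANONICAL.get(w, w) for w in words]

def readback_mismatches(instruction: str, readback: str) -> set[str]:
    """Return salient tokens (digits and key command words) that appear in
    the instruction but are missing from the readback."""
    keywords = {"climb", "descend", "maintain", "heading", "runway", "cleared"}
    back = set(normalise(readback))
    return {t for t in normalise(instruction)
            if (t.isdigit() or t in keywords) and t not in back}

# Example: the pilot reads back "fife" (5) where the instruction said "zero".
print(readback_mismatches("maintain heading one eight zero",
                          "maintain heading one eight fife"))
# -> {'0'}: the final heading digit was not read back correctly.
```

A real system would of course have to cope with recognition errors, callsigns and far richer phraseology, but the underlying idea – comparing what was said with what was read back – is the same.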

It is vital that they understand each other, whatever their native language.


Dominique Estival, Researcher in Linguistics, Western Sydney University

This article was originally published on The Conversation. Read the original article.

Beware the bad big wolf: why you need to put your adjectives in the right order


image-20160906-25260-dcj9cp.jpg

Simon Horobin, University of Oxford

Unlikely as it sounds, the topic of adjective use has gone “viral”. The furore centres on the claim, taken from Mark Forsyth’s book The Elements of Eloquence, that adjectives appearing before a noun must appear in the following strict sequence: opinion, size, age, shape, colour, origin, material, purpose, Noun. Even the slightest attempt to disrupt this sequence, according to Forsyth, will result in the speaker sounding like a maniac. To illustrate this point, Forsyth offers the following example: “a lovely little old rectangular green French silver whittling knife”.

 

But is the “rule” worthy of an internet storm – or is it more of a ripple in a teacup? Well, certainly the example is a rather unlikely sentence, and not simply because whittling knives are not in much demand these days – ignoring the question of whether they can be both green and silver. This is because it is unusual to have a string of attributive adjectives (ones that appear before the noun they describe) like this.

More usually, speakers of English break up the sequence by placing some of the adjectives in predicative position – after the noun. Not all adjectives, however, can be placed in either position. I can refer to “that man who is asleep” but it would sound odd to refer to him as “that asleep man”; we can talk about the “Eastern counties” but not the “counties that are Eastern”. Indeed, our distribution of adjectives both before and after the noun reveals another constraint on adjective use in English – a preference for no more than three before a noun. An “old brown dog” sounds fine, a “little old brown dog” sounds acceptable, but a “mischievous little old brown dog” sounds plain wrong.

Rules, rules, rules

Nevertheless, however many adjectives we choose to employ, they do indeed tend to follow a predictable pattern. While native speakers intuitively follow this rule, most are unaware that they are doing so; we agree that the “red big dog” sounds wrong, but don’t know why. In order to test this intuition, linguists have analysed large corpora of electronic data to see how frequently pairs of adjectives like “big red” are preferred to “red big”. The results confirm our native intuition, although the figures are not as comprehensive as we might expect – the rule accounts for 78% of the data.
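The counting involved is simple to picture. The sketch below shows the logic on a tiny invented corpus; the actual studies use corpora of millions of words, but the method of tallying each ordering of an adjacent adjective pair is essentially this.

```python
from collections import Counter
from itertools import combinations

# A tiny invented corpus standing in for the large electronic corpora
# linguists actually use; only the counting logic matters here.
corpus = [
    "the big red dog barked",
    "a lovely little old knife",
    "she wore a little black dress",
    "a big red balloon and a red big balloon",  # one rule-breaking order
]

adjectives = {"big", "red", "little", "old", "lovely", "black"}
pair_order = Counter()

for sentence in corpus:
    words = sentence.split()
    for w1, w2 in zip(words, words[1:]):   # adjacent word pairs
        if w1 in adjectives and w2 in adjectives:
            pair_order[(w1, w2)] += 1

# For each unordered pair, compare how often each ordering occurs.
for a, b in combinations(sorted(adjectives), 2):
    ab, ba = pair_order[(a, b)], pair_order[(b, a)]
    if ab or ba:
        print(f"'{a} {b}': {ab}  vs  '{b} {a}': {ba}")
```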

We know how to use them … without even being aware of it.
Shutterstock

But while linguists have been able to confirm that there are strong preferences in the ordering of pairs of adjectives, no such statistics have been produced for longer strings. Consequently, while Forsyth’s rule appears to make sense, it remains an untested, hypothetical, large, sweeping (sorry) claim.

In fact, even if we stick to just two adjectives it is possible to find examples that appear to break the rule. The “big bad wolf” of fairy tale, for instance, shows the size adjective preceding the opinion one; similarly, “big stupid” is more common than “stupid big”. Examples like these are instead witness to the “Pollyanna Principle”, by which speakers prefer to present positive, or indifferent, values before negative ones.

Another problem with Forsyth’s proposed ordering sequence is that it makes no reference to other constraints on adjective order, such as what happens when we use two adjectives from the same category. Little Richard’s song “Long Tall Sally” would have sounded strange if he had called it Tall Long Sally, but these are both adjectives of size.

Definitely not Tall Long Sally.

Similarly, we might describe a meal as “nice and spicy” but never “spicy and nice” – reflecting a preference for the placement of general opinions before more specific ones. We also need to bear in mind the tendency for noun phrases to become lexicalised – forming words in their own right. Just as a blackbird is not any kind of bird that is black, a little black dress does not refer to any small black dress but one that is suitable for particular kinds of social engagement.

Since speakers view a “little black dress” as a single entity, its order is fixed; as a result, modifying adjectives must precede little – a “polyester little black dress”. This means that an adjective specifying its material appears before those referring to size and colour, once again contravening Forsyth’s rule.

Making sense of language

Of course, the rule is a fair reflection of much general usage – although the reasons behind this complex set of constraints in adjective order remain disputed. Some linguists have suggested that it reflects the “nouniness” of an adjective; since colour adjectives are commonly used as nouns – “red is my favourite colour” – they appear close to that slot.

Another conditioning factor may be the degree to which an adjective reflects a subjective opinion rather than an objective description – therefore, subjective adjectives that are harder to quantify (boring, massive, middle-aged) tend to appear further away from the noun than more concrete ones (red, round, French).

Prosody, the rhythm and sound of poetry, is likely to play a role, too – as there is a tendency for speakers to place longer adjectives after shorter ones. But probably the most compelling theory links adjective position with semantic closeness to the noun being described; adjectives that are closely related to the noun in meaning, and are therefore likely to appear frequently in combination with it, are placed closest, while those that are less closely related appear further away.

In Forsyth’s example, it is the knife’s whittling capabilities that are most significant – distinguishing it from a carving, fruit or butter knife – while its loveliness is hardest to define (what are the standards for judging the loveliness of a whittling knife?) and thus most subjective. Whether any slight reorganisation of the other adjectives would really prompt your friends to view you as a knife-wielding maniac is harder to determine – but then, at least it’s just a whittling knife.


Simon Horobin, Professor of English Language and Literature, University of Oxford

This article was originally published on The Conversation. Read the original article.

Britain may be leaving the EU, but English is going nowhere



Andrew Linn, University of Westminster

After Brexit, there are various things that some in the EU hope to see and hear less in the future. One is Nigel Farage. Another is the English language.

In the early hours of June 24, as the referendum outcome was becoming clear, Jean-Luc Mélenchon, left-wing MEP and French presidential candidate, tweeted that “English cannot be the third working language of the European parliament”.

This is not the first time that French and German opinion has weighed in against alleged disproportionate use of English in EU business. In 2012, for example, a similar point was made about key eurozone recommendations from the European Commission being published initially “in a language which [as far as the Euro goes] is only spoken by less than 5m Irish”. With the number of native speakers of English in the EU set to drop from 14% to around 1% of the bloc’s total with the departure of the UK, this point just got a bit sharper.

Translation overload

Official EU language policy is multilingualism with equal rights for all languages used in member states. It recommends that “every European citizen should master two other languages in addition to their mother tongue” – Britain’s abject failure to achieve this should make it skulk away in shame.

The EU recognises 24 “official and working” languages, a number that has mushroomed from the original four (Dutch, French, German and Italian) as more countries have joined. All EU citizens have a right to access EU documents in any of those languages. This calls for a translation team numbering around 2,500, not to mention a further 600 full-time interpreters. In practice most day-to-day business is transacted in either English, French or German and then translated, but it is true that English dominates to a considerable extent.

Lots of work still to do.
Etienne Ansotte/EPA

The preponderance of English has nothing to do with the influence of Britain or even Britain’s membership of the EU. Historically, the expansion of the British empire, the impact of the industrial revolution and the emergence of the US as a world power have embedded English in the language repertoire of speakers across the globe.

Unlike Latin, which outlived the Roman empire as the lingua franca of medieval and renaissance Europe, English of course has native speakers (who may be unfairly advantaged), but it is those who have learned English as a foreign language – “Euro-English” or “English as a lingua franca” – who now constitute the majority of users.

According to the 2012 Special Eurobarometer on Europeans and their Languages, English is the most widely spoken foreign language in 19 of the member states where it is not an official language. Across Europe, 38% of people speak English well enough as a foreign language to have a conversation, compared with 12% for French and 11% for German.

The report also found that 67% of Europeans consider English the most useful foreign language, and that the numbers favouring German (17%) or French (16%) have declined. As a result, 79% of Europeans want their children to learn English, compared to 20% for French and German.

Too much invested in English

Huge sums have been invested in English teaching by both national governments and private enterprise. As the demand for learning English has increased, so has the supply. English language learning worldwide was estimated to be worth US$63.3 billion (£47.5 billion) in 2012, and it is expected that this market will rise to US$193.2 billion (£145.6 billion) by 2017. The value of English for speakers of other languages is not going to diminish any time soon. There is simply too much invested in it.

Speakers of English as a second language outnumber first-language English speakers by 2:1 both in Europe and globally. For many Europeans, and especially those employed in the EU, English is a useful piece in a toolbox of languages to be pressed into service when needed – a point which was evident in a recent project on whether the use of English in Europe was an opportunity or a threat. So in the majority of cases using English has precisely nothing to do with the UK or Britishness. The EU needs practical solutions and English provides one.

English is unchallenged as the lingua franca of Europe. It has even been suggested that in some countries of northern Europe it has become a second rather than a foreign language. Jan Paternotte, D66 party leader in Amsterdam, has proposed that English should be decreed the official second language of that city.

English has not always held its current privileged status. French and German have both functioned as common languages for high-profile fields such as philosophy, science and technology, politics and diplomacy, not to mention Church Slavonic, Russian, Portuguese and other languages in different times and places.

We can assume that English will not maintain its privileged position forever. Those who benefit now, however, are not the predominantly monolingual British, but European anglocrats whose multilingualism provides them with a key to international education and employment.

Much about the EU may be about to change, but right now an anti-English language policy so dramatically out of step with practice would simply make the post-Brexit hangover more painful.


Andrew Linn, Pro-Vice-Chancellor and Dean of Social Sciences and Humanities, University of Westminster

This article was originally published on The Conversation. Read the original article.

How other languages can reveal the secrets to happiness



Tim Lomas, University of East London

The limits of our language are said to define the boundaries of our world. This is because in our everyday lives, we can only really register and make sense of what we can name. We are restricted by the words we know, which shape what we can and cannot experience.

It is true that sometimes we may have fleeting sensations and feelings that we don’t quite have a name for – akin to words on the “tip of our tongue”. But without a word to label these sensations or feelings they are often overlooked, never to be fully acknowledged, articulated or even remembered. And instead, they are often lumped together with more generalised emotions, such as “happiness” or “joy”. This applies to all aspects of life – and not least to that most sought-after and cherished of feelings, happiness. Clearly, most people know and understand happiness, at least vaguely. But they are hindered by their “lexical limitations” and the words at their disposal.

As English speakers, we inherit, rather haphazardly, a set of words and phrases to represent and describe our world around us. Whatever vocabulary we have managed to acquire in relation to happiness will influence the types of feelings we can enjoy. If we lack a word for a particular positive emotion, we are far less likely to experience it. And even if we do somehow experience it, we are unlikely to perceive it with much clarity, think about it with much understanding, talk about it with much insight, or remember it with much vividness.

Speaking of happiness

While this recognition is sobering, it is also exciting, because it means by learning new words and concepts, we can enrich our emotional world. So, in theory, we can actually enhance our experience of happiness simply through exploring language. Prompted by this enthralling possibility, I recently embarked on a project to discover “new” words and concepts relating to happiness.

I did this by searching for so-called “untranslatable” words from across the world’s languages. These are words for which no exact equivalent word or phrase exists in English. As such, they suggest the possibility that other cultures have stumbled upon phenomena that English-speaking places have somehow overlooked.

Perhaps the most famous example is “Schadenfreude”, the German term describing pleasure at the misfortunes of others. Such words pique our curiosity, as they appear to reveal something specific about the culture that created them – as if German people are potentially especially liable to feelings of Schadenfreude (though I don’t believe that’s the case).

Germans are no more likely to experience Schadenfreude than they are to drink steins of beer in Bavarian costume.
Kzenon/Shutterstock

However, these words actually may be far more significant than that. Consider the fact that Schadenfreude has been imported wholesale into English. Evidently, English speakers had at least a passing familiarity with this kind of feeling, but lacked the word to articulate it (although I suppose “gloating” comes close) – hence, the grateful borrowing of the German term. As a result, their emotional landscape has been enlivened and enriched, able to give voice to feelings that might previously have remained unconceptualised and unexpressed.

My research searched for these kinds of “untranslatable words”, specifically ones related to happiness and well-being. And so I trawled the internet looking for relevant websites, blogs, books and academic papers, and gathered a respectable haul of 216 such words. Now the list has expanded – partly due to the generous feedback of visitors to my website – to more than 600 words.

Enriching emotions

When analysing these “untranslatable words”, I divide them into three categories based on my subjective reaction to them. Firstly, there are those that immediately resonate with me as something I have definitely experienced, but just haven’t previously been able to articulate. For instance, I love the strange German noun “Waldeinsamkeit”, which captures that eerie, mysterious feeling that often descends when you’re alone in the woods.

A second group are words that strike me as somewhat familiar, but not entirely, as if I can’t quite grasp their layers of complexity. For instance, I’m hugely intrigued by various Japanese aesthetic concepts, such as “aware” (哀れ), which evokes the bitter-sweetness of a brief, fading moment of transcendent beauty. This is symbolised by the cherry blossom – and as spring bloomed in England I found myself reflecting at length on this powerful yet intangible notion.

Finally, there is a mysterious set of words which completely elude my grasp, but which for precisely that reason are totally captivating. These mainly hail from Eastern religions – terms such as “Nirvana” or “Brahman”, the latter of which translates roughly as the ultimate reality underlying all phenomena in the Hindu scriptures. It feels like it would require a lifetime of study to even begin to grasp the meaning – which is probably exactly the point of these types of words.

Now we can all ‘tepils’ like the Norwegians – that’s drink beer outside on a hot day, to you and me
Africa Studio/Shutterstock

I believe these words offer a unique window onto the world’s cultures, revealing diversity in the way people in different places experience and understand life. People are naturally curious about other ways of living, about new possibilities in life, and so are drawn to ideas – like these untranslatable words – that reveal such possibilities.

There is huge potential for these words to enrich and expand people’s own emotional worlds: each of them offers a tantalising glimpse into unfamiliar and new positive feelings and experiences. And at the end of the day, who wouldn’t be interested in adding a bit more happiness to their own lives?


Tim Lomas, Lecturer in Applied Positive Psychology , University of East London

This article was originally published on The Conversation. Read the original article.

Where have all the adult students gone?



The EFL industry in Spain enjoyed a mini boom during the early years of the global economic crisis, as many adult students rushed to improve their English language skills, either to get themselves back into the job market or in an attempt to hang on to the job they had. As we reached the new decade, the boom slowed down and then started to tail off. But no-one expected the sudden and significant drop in adult student numbers that hit the industry at the start of the current academic year.

The drop wasn’t school-, city-, or even region-specific; it was the same story all over Spain. And the numbers were eye-watering. Depending on who you talk to (and/or who you believe), adult student numbers fell by between 10% and 20%. Enough to make any school owner or manager wince.

What happened? Where did all these students go? Well, as is normally the case, there is no one simple answer. There has been a slight upturn in in-company teaching, so it may be that some students who were previously paying for their own courses in our schools are now studying in their company (if they’re fortunate enough to have a job in the first place; Spanish unemployment is still well over 20%).

The standard of English teaching in mainstream education is also getting better, slowly, so it may be that there are more school leavers who have achieved a basic level of communicative competence.

Some adult students – especially the younger ones – may also have decided to switch from a traditional, bricks and mortar language school to a Web-based classroom.

My own theory is that it’s the free movement of labour in the European Union that is having the greatest effect on our market. In other words, as there are so few jobs available in Spain, hundreds of thousands of young adults – many of whom may previously have been our students – have simply upped sticks and gone abroad to find work.

A recent survey conducted in the UK indicates that migrants from Spain rose to 137,000 in 2015 (up from 63,000 in 2011). Most of them are probably working in relatively unskilled jobs in hotels, bars and restaurants, but at least they’re working – and they’re improving their English language skills as they go.

A similar number probably emigrated to other countries in the north of Europe and another significant number emigrated to Latin America. Add up all these emigrants and we could be looking at a total of well over 300,000 migrants – just in 2015.

On a recent trip to Oxford I met a young Spanish guy, working in a hotel, who had previously been a student at our school in Barcelona. He’s a typical example. Would he ever move back to Spain, I asked him? Perhaps, in the future, he said, but only if the situation in Spain changes and he can find a decent job. His new fluency in English, learnt by living and working in Oxford, might just help him with that.

So where does that leave Spanish language schools? Will adult students come back to our schools in the same numbers as before? Probably not. But that doesn’t mean we have to give up on this market. If adult students won’t come to us, we can use the Internet to take our services to them. Even those living and working abroad.

 

This article was written by Jonathan Dykes. His blog can be found here: https://jonathandykesblog.wordpress.com/2016/06/10/where-have-all-the-adult-students-gone/

Could early music training help babies learn language?



Christina Zhao, University of Washington

Growing up in China, I started playing piano when I was nine years old and learning English when I was 12. Later, when I was a college student, it struck me how similar language and music are to each other.

Language and music both require rhythm; otherwise they don’t make any sense. They’re also both built from smaller units – syllables and musical beats. And the process of mastering them is remarkably similar, including precise movements, repetitive practice and focused attention. I also noticed that my musician peers were particularly good at learning new languages.

All of this made me wonder if music shapes how the brain perceives sounds other than musical notes. And if so, could learning music help us learn languages?

Music experience and speech

Music training early in life (before the age of seven) can have a wide range of benefits beyond musical ability.

For instance, school-age children (six to eight years old) who participated in two years of music classes, four hours each week, showed better brain responses to consonants than peers who started a year later. This suggests that music experience helped the children hear speech sounds.

Music may have a range of benefits.
Breezy Baldwin, CC BY

But what about babies who aren’t talking yet? Can music training this early give babies a boost in the steps it takes to learn language?

The first year of life is the best time in the lifespan to learn speech sounds, yet no studies had looked at whether musical experience during infancy can improve speech learning.

I sought to answer this question with Patricia K. Kuhl, an expert in early childhood learning. We set out to study whether musical experience at nine months of age can help infants learn speech.

Nine months is within the peak period for infants’ speech sound learning. During this time, they’re learning to pay attention to the differences among the different speech sounds that they hear in their environment. Being able to differentiate these sounds is key for learning to speak later. A better ability to tell speech sounds apart at this age is associated with producing more words at 30 months of age.

Here is how we did our study

In our study, we randomly assigned 47 nine-month-old infants to either a music group or a control group, and each group completed 12 15-minute sessions of activities designed for it.

Babies in the music group sat with their parents, who guided them through the sessions by tapping out beats in time with the music with the goal of helping them learn a difficult musical rhythm.

Here is a short video demonstration of what a music session looked like.

Infants in the control group played with toy cars, blocks and other objects that required coordinated movements in social play, but without music.

After the sessions, we measured the babies’ brain responses to musical and speech rhythms using magnetoencephalography (MEG), a brain imaging technique.

New music and speech sounds were presented in rhythmic sequences, but the rhythms were occasionally disrupted by skipping a beat.

These rhythmic disruptions helped us measure how well the babies’ brains were attuned to rhythms. The brain gives a specific response pattern when it detects an unexpected change; a bigger response indicates that the baby was following the rhythms better.
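In schematic form, a stimulus stream of this kind might be generated as in the toy sketch below. The interval, sequence length and skip probability are invented numbers, not the parameters of our study; the sketch only illustrates the idea of an evenly spaced beat train with occasional omitted beats.

```python
import random

def beat_sequence(n_beats=60, interval_ms=400, skip_prob=0.1, seed=1):
    """Onset times (in ms) for an evenly spaced beat train in which some
    expected beats are silently omitted - the rhythmic 'disruptions'."""
    random.seed(seed)
    onsets, skipped = [], []
    for i in range(n_beats):
        t = i * interval_ms
        if i > 0 and random.random() < skip_prob:
            skipped.append(t)   # an expected beat that never sounds
        else:
            onsets.append(t)
    return onsets, skipped

onsets, skipped = beat_sequence()
print(f"{len(onsets)} beats presented, {len(skipped)} expected beats skipped")
```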

Babies in the music group had stronger brain responses to both music and speech sounds than babies in the control group. This shows that musical experience, as early as nine months of age, improved infants’ ability to process both musical and speech rhythms.

These skills are important building blocks for learning to speak.

Other benefits from music experience

Language is just one example of a skill that can be improved through music training. Music can help with social-emotional development, too. An earlier study by researchers Tal-Chen Rabinowitch and Ariel Knafo-Noam showed that pairs of eight-year-olds who didn’t know each other reported feeling more close and connected with one another after a short exercise of tapping out beats in sync with each other.

Music helps children bond better.
Boy image via www.shutterstock.com

Another researcher, Laura Cirelli, showed that 14-month-old babies were more likely to show helping behaviors toward an adult after the babies had been bounced in sync with the adult who was also moving rhythmically.

There are many more exciting questions that remain to be answered as researchers continue to study the effects of music experience on early development.

For instance, does the music experience need to be in a social setting? Could babies get the benefits of music from simply listening to music? And, how much experience do babies need over time to sustain this language-boosting benefit?

Music is an essential part of being human. It has existed in human cultures for thousands of years, and it is one of the most fun and powerful ways for people to connect with each other. Through scientific research, I hope we can continue to reveal how music experience influences brain development and language learning of babies.


Christina Zhao, Postdoctoral Fellow, University of Washington

This article was originally published on The Conversation. Read the original article.

Adorbs new words will only really join the English language when we see them in print



Gillian Rudd, University of Liverpool

Listicle: an article made up of lists. This may be regarded as
Bare lazy as it obviates the need for coherent paragraphs, or as
Douchebaggery, if it’s taken to be
Clickbait.

The temptation to create a listicle in response to the latest raft of words to win a place in the Oxford English Dictionary is great, but to be resisted. These latest additions do not simply show a love of creating words – that is something that has always happened in response to changes in life and attitudes.

Instead, it demonstrates our current ability to promulgate such nonsense words, allowing them to gain sudden currency, perhaps through “trending”, to make use of another relative newcomer to the fold, or “retweeting”. (These are not to be confused with the newcomer “subtweets” which themselves perpetuate another long tradition in English – that of making snide remarks through indirect allusion in a public arena. Alexander Pope would have been a great subtweeter.)

Some of the newly accepted words make one of the main processes of linguistic evolution clear: that of creating a new word by analogy with one already in use. “Binge-watching” is the clearest example. This is the viewing of several episodes or indeed whole series of a televised drama in one sitting. This word is clearly created by analogy with “binge-drinking”, which came to replace the phrase “going on a binge” or “going on a bender” when referring to drinking large amounts of alcohol over a short space of time.

Yes, there’s a difference here – where the earlier two phrases indicated that the occurrence was infrequent, if not actually unusual, “binge-drinking” is habitual, normally taking place at weekends, much as “binge-watching” does for many. I’d like to think that “binge-browsing” might be next, with the specific meaning of spending hours browsing the OED site after visiting to look up just one word. But possibly this is not a habit to encourage – after all, “YOLO”.

Such changes always provoke reaction. Reliably, this varies from outrage at the abuse of language and ignorance of etymological development that such words betray, to celebration of English as a language flexible enough to admit such vibrant new forms and accommodate the creativity of its users.

But what’s interesting to me, as someone whose most frequent uses of dictionaries are to correct spelling and check historical usage, is the way that great institution, the Oxford English Dictionary, is able to satisfy two roles at once. This is thanks to its dual format – in print and online. It’s the online version that will soon include “listicle” and the rest, with no guarantee that these words will make it into the next print version (assuming there is one, which is what the current distinction between print and online versions implies).

This allows the OED to record passing uses and trends without compromising its role as final arbiter of whether or not a word can be said to have entered the English language. This is, after all, a decision which to a large extent depends on proving that a word not only gained currency but retained a decent level of recorded usage over a period of time and, crucially, in print.

And so print retains its sense of permanence in the face of ephemeral but ubiquitous electronic media. Or apparently ephemeral. The recent ruling requiring Google in particular to “remove” records from the internet has reminded us that it is in fact all but impossible to delete anything committed to the electronic ether – however paradoxical that seems. It’s all still out there, it’s just no longer appearing in the search results.

Googling itself is a word now accepted by the online OED, and while at first its currency was an indicator of the success of the company, it’s interesting to speculate on the survival of the word should Google itself go under, or lose its predominant position. Would we then all revert to “web-searching” for background information, or would we google, just as we hoover, forgetful of the fact that the common verb once indicated a specific, dominant company?

Only the print version of the OED will tell.


Gillian Rudd, Professor in English Literature, University of Liverpool

This article was originally published on The Conversation. Read the original article.

What’s the point of education if Google can tell us anything?



Ibrar Bhatt, Lancaster University

Can’t remember the names of the two elements that the scientist Marie Curie discovered? Or who won the 1945 UK general election? Or how far away the sun is from the earth? Ask Google.

Constant access to an abundance of online information at the click of a mouse or tap of a smartphone has radically reshaped how we socialise, inform ourselves of the world around us and organise our lives. If all facts can be summoned instantly by looking online, what’s the point of spending years learning them at school and university? In the future, it might be that once young people have mastered the basics of how to read and write, they undertake their entire education merely through accessing the internet via search engines such as Google, as and when they want to know something.

Some educational theorists have argued that you can replace teachers, classrooms, textbooks and lectures by simply leaving students to their own devices to search and collect information about a particular topic online. Such ideas have called into question the value of a traditional system of education, one in which teachers simply impart knowledge to students. Of course, others have warned against the dangers of this kind of thinking and the importance of the teacher and human contact when it comes to learning.

Such debate about the place and purpose of online searching in learning and assessments is not new. But rather than thinking of ways to prevent students from cheating or plagiarising in their assessed pieces of work, maybe our obsession with the “authenticity” of their coursework or assessment is missing another important educational point.

Digital content curators

In my recent research looking at the ways students write their assignments, I found that they may not always compose written work that is truly “authentic” – and that this may not be as important as we think. Instead, through prolific use of the internet, students engaged in a number of sophisticated practices to search, sift, critically evaluate, anthologise and re-present pre-existing content. By closely examining the moment-by-moment ways in which students write assignments, I came to see how all the pieces of text they produced contained elements of something else. These practices need to be better understood and then incorporated into new forms of education and assessment.

These online practices are about harnessing an abundance of information from a multitude of sources, including search engines like Google, in what I call a form of “digital content curation”. Curation in this sense is about how learners use existing content to produce new content through engaging in problem-solving and intellectual inquiry, and creating a new experience for readers.

Lessons in how to search.
Students via bikeriderlondon/www.shutterstock.com

Part of this is developing a critical eye about what’s being searched for online – “crap-detection” – whilst wading through the deluge of available information. This aspect is vital to any educationally serious notion of information curation, as learners increasingly use the web as an extension of their own memory when searching.

Students must begin by understanding that most online content is already curated by search engines like Google, using their PageRank algorithm and other indicators. Curation, therefore, becomes a kind of stewardship of other people’s writing and requires entering into a conversation with the writers of those texts. It is a crucial kind of “digital literacy”.
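The intuition behind PageRank can be shown with a toy version of the original algorithm (Google’s production ranking is vastly more elaborate and combines many other signals): a page is ranked highly if highly ranked pages link to it. The link graph and page names below are invented.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank by power iteration.
    links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outgoing in links.items():
            targets = outgoing or pages          # dangling pages share evenly
            share = damping * rank[page] / len(targets)
            for target in targets:
                new_rank[target] += share
        rank = new_rank
    return rank

# An invented three-page web: the more a page is linked to, the higher it ranks.
toy_web = {"essay": ["wiki"], "wiki": ["essay", "blog"], "blog": ["wiki"]}
for page, score in sorted(pagerank(toy_web).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

Even this toy version makes the pedagogical point: what a search engine shows first is itself the product of a ranking judgement, not a neutral list.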

Curation has, through pervasive connectivity, found its way into educational contexts. There is now a need to better understand how practices of online searching and the kinds of writing emerging from curation can be incorporated into the way we assess students.

How to assess these new skills

While writing for assessment tends to focus on the production of a student’s own, “authentic” work, it could also take curation practices into account. Take, for example, a project designed as a kind of digital portfolio. This could require students to locate information on a particular question, organise existing web extracts in a digestible and story-like way, acknowledge their sources, and present an argument or thesis.

Solving problems through synthesising large amounts of information, often collaboratively, and engaging in exploratory and problem-solving pursuits (rather than just memorising facts and dates) are key skills in the 21st-century, information-based economy. As the London Chamber of Commerce has highlighted, we must make sure young people and graduates enter employment with these skills.

My own research has shown that young people may already be expert curators as part of their everyday internet experience and surreptitious assignment writing strategies. Teachers and lecturers need to explore and understand these practices better, and create learning opportunities and academic assessment tasks around these somewhat “hard to assess” skills.

In an era of informational abundance, educational end-products – the exam or piece of coursework – need to become less about a single student creating an “authentic” text, and more about a certain kind of digital literacy which harnesses the wisdom of the network of information that is available at the click of a button.


Ibrar Bhatt, Senior Research Associate, Lancaster University

This article was originally published on The Conversation. Read the original article.

Are youse using English properly – or mangling your native tongue?


We long ago lost our second person plural – but that hasn’t stopped us adapting.

Rob Pensalfini, The University of Queensland

Languages evolve and transform. If that weren’t the case, the only word in the previous sentence that would be considered English is and (which in any case used to mean if). The English we speak would not be remotely comprehensible to Geoffrey Chaucer, who wrote The Canterbury Tales some 600 years ago.

Contemporary accents in particular would sound very foreign to Shakespeare’s ears, and the grammatical structure of the language has changed in subtle ways in the 400 years since he died.

For the most part, those changes don’t affect the expressiveness of the language or the ease of making certain important distinctions in speech and writing. Yet language-change is not consciously guided: it’s unpredictable and sometimes chaotic. So what if language change gets it “wrong”?

Contemporary Standard Englishes (UK, USA, Australian, NZ, SA etc) distinguish singular from plural for all nouns and pronouns, with a few exceptions.

The few exceptions among nouns – such as “sheep” – rarely, if ever, cause confusion or lack of clarity. The problematic case is the second person pronoun “you”. All the other pronouns not only vary from singular to plural, but also generally have distinct forms that vary for “case” or – put simplistically – whether the word is the subject or object of the sentence:

“I love language” versus “language fascinates me”.

The second person is simply you, whether singular or plural, subject or object. But that wasn’t always the situation. As recently as 400 years ago, second person pronouns were as follows:

thou – singular subject (“What art thou staring at?”)
thee – singular object (“I have brought thee a cabbage”)
ye – plural subject
you – plural object

You took over as the plural form for both subject and object, but then eventually also supplanted the singular forms, so that we can now no longer be certain whether a sentence such as “I need you to help me” is directed at one person in a group or at the whole group.

We can of course get round it by adding phrases such as “You, with the blue shirt” or “you boys,” but compared to the elegant thou versus you this is clunky, and the verbiage almost defeats the advantage of having a pronoun, a shortcut to reference, in the first place. It’s a very useful distinction.

How on earth did we lose it?

What art thou staring at?
Wikimedia Commons

My favourite hypothesis is that it fell victim to the increasing taste for formality in English-speaking society in the 17th through 19th centuries. You in Shakespeare’s day was not only used for the plural, but could be used to address a single person in a formal context – usually if the person was of a higher social status or rank than the speaker, or if they were a stranger of presumably equal rank.

The use of you to a singular person indicated a kind of deference and social distance, and was formal in tone. One might say “I have brought thee a cabbage” to one’s brother or friend, but “I have brought you a cabbage” to a king, bishop, or employer (unless on intimate terms).

Many languages, such as French, still do this – they maintain a distinction between singular and plural second person, but use the plural form (vous) to a single person to indicate politeness or formality.

When I first read Pride and Prejudice, I was astonished by Mr and Mrs Bennet, married for decades, alone at a breakfast table, addressing one another as Mr Bennet and Mrs Bennet. It’s not outlandish as an expression of endearment (as some couples use Mum and Dad to one another), but we can presume that a writer as astute as Jane Austen would have been reflecting social concerns and trends.

From my non-expert reading of the history of these times, it seems the level of formality increased in all interactions, even the most intimate, after the Renaissance, reaching a zenith in the Regency and Victorian eras.


People would have used the formal second person you in more and more contexts, and the familiar/intimate thou less, until a tipping point was reached and the singular forms disappeared entirely.

Contemporary English-speaking societies have retreated from that level of formality. Even the most formal interactions, such as job interviews and audiences with dignitaries, are far more casual than they were 200 years ago. Plus, we lost our means of distinguishing with a mere word whom exactly we were addressing.

That’s why, independently in many varieties of English around the world, the distinction has been re-introduced. Not by the resurrection of thou, but by keeping you as the singular, and introducing a new plural such as youse (Australia, NZ, SA, Ireland, Scotland), yinz (Pittsburgh, parts of UK) and y’all (US South, West Indies, Alberta).


No committee approved it. Some folks started using it and, because it filled a need, it spread. Once an old form such as thou has disappeared from a language, it is unlikely to return even if a need for it arises.

Rather, speakers will use the available resources of the living language to innovate. So youse (or yous) is simply a regular “add an ‘s’” plural, y’all is a contraction of the phrase you all, and yinz appears to be a contraction of you ones.

In some places the phrasal you(s) guys is used, and in Kriol, an Aboriginal language of the Northern Territory, the plural yumob comes from you mob.

So, will this very useful innovation become standard? That’s impossible to predict, but we know that many people react negatively to any linguistic innovation, especially one that arises from non-Standard varieties.

The paradox of this prescriptivism is this: most prescriptivists don’t want to see the attrition of a language’s expressivity and nuance. But prescriptivism rarely prevents the disappearance of forms and structures. It didn’t save thou. But what it may hamper is the arrival or spread of innovations.

Prescriptivism doesn’t like to let stuff in, but it’s no good at stopping stuff from falling out.


Rob Pensalfini, Senior Lecturer, School of English, Media Studies and Art History, The University of Queensland

This article was originally published on The Conversation. Read the original article.