Recent global developments have sharply polarised communities in many countries around the world. A new politics of exclusion has drawn urgent attention to the ways in which structural inequality has marginalised and silenced certain sectors of society. And yet, as a recent report shows, diversity and inclusion in fact “benefit the common good”. A more diverse group is a stronger, more creative and productive group.
In the world of literary writing, we find similar gaps and exclusions. But these are counterbalanced in some respects by new positive initiatives.
In 2015, a study revealed that literature by writers of colour had been consistently under-represented by the predominantly white British book industry. Statistics in The Bookseller show that out of thousands of books published in 2016 in the UK, fewer than 100 were by British authors of a non-white background. And out of 400 authors identified by the British public in a 2017 Royal Society of Literature survey, only 7% were black, Asian or of mixed race (compared to 13% of the population).
A similar marginalisation takes place in the curricula in schools and universities, mirroring exclusions in wider society. In most English literature courses of whatever period, the writers taught are white, largely English and largely male.
A fundamental inequality arises in which, though British culture at large is diverse, syllabuses are not. Indeed, many British readers and students find little to recognise or to identify with when they read and study mainstream British literature.
But it’s not just a case of under-representation. It’s also a case of misrepresentation.
Black and Asian writers who have been published within the mainstream British system describe the pressure they have felt to conform to cultural stereotypes in their work. Their books are often packaged and presented in ways that focus on their ethnicity, regularly using cliches. At the same time, more universal aspects of their writing are overlooked. For example, the covers of novels by Asian British writers usually stick to a limited colour palette of yellows, reds, and purples, accented by “exotic” images.
These writers bristle at the sense that they are read not as crafters of words and worlds, but as spokespeople for their communities or cultures. At its worst, this process turns these writers and their books into objects of anthropological curiosity rather than works inviting serious literary study or simply pleasurable reading. The message is that black and Asian literature is other than or outside mainstream British writing.
Against these exclusions, leading British authors such as Bernardine Evaristo and others have called for a broader, more inclusive approach. They recognise that what and how we read shapes our sense of ourselves, our communities and the world.
Reframing the narrative
The Postcolonial Writers Make Worlds research project, based in the Oxford English Faculty and The Oxford Research Centre in the Humanities, set out to ask what it means to read contemporary fiction as British readers. Working with reading groups and in discussion with writers, we found that readers of all ages entered the relatively unfamiliar worlds created by BAME authors with interest.
For many, finding points of familiarity along gender, age, geographical or other lines was important for their ability to enjoy stories from communities different from their own. Identifying in this way gave some readers new perspectives on their own contexts. At the same time, unfamiliarity was not a barrier to identification. In some cases, universal human stories, like falling in love, acted as a bridge. This suggests that how literature is presented to readers, whether it is framed as other or not, can be as significant as what is represented.
Contemporary black and Asian writing from the UK is British writing. And this means that the work of writers such as Evaristo, Nadifa Mohamed and Daljit Nagra should be placed on the same library shelf, reading list and section of the bookshop as work by Ian McEwan, Julian Barnes and Ali Smith – not exclusively in “world interest” or “global literature”.
Equally, much can be gained by thinking of white British writers like Alan Hollinghurst or Hilary Mantel as having as much of a cross-cultural or even postcolonial outlook as Aminatta Forna and Kamila Shamsie.
There are positive signs. A new Edexcel/Pearson A-level teaching resource on Contemporary Black British Literature has been developed. The Why is My Curriculum White? campaign continues to make inroads in university syllabuses. And the Jhalak Prize is raising the profile of BAME writing in Britain. Against this background, the Postcolonial Writers Make Worlds website offers a multimedia hub of resources on black and Asian British writing, providing points of departure for more inclusive, wide-ranging courses. Yet there is still much to be done.
All literature written in English in the British Isles is densely entangled with other histories, cultures, and pathways of experience both within the country and far beyond. Its syllabuses, publishing practices, and our conversations about books must reflect this.
Do you remember being taught you should never start your sentences with “And” or “But”?
What if I told you that your teachers were wrong and there are lots of other so-called grammar rules that we’ve probably been getting wrong in our English classrooms for years?
How did grammar rules come about?
To understand why we’ve been getting it wrong, we need to know a little about the history of grammar teaching.
Grammar is how we organise our sentences in order to communicate meaning to others.
Those who say there is one correct way to organise a sentence are called prescriptivists. Prescriptivist grammarians prescribe how sentences must be structured.
Prescriptivists had their day in the sun in the 18th century. As books became more accessible to the everyday person, prescriptivists wrote the first grammar books to tell everyone how they must write.
These self-appointed guardians of the language just made up grammar rules for English, and put them in books that they sold. It was a way of ensuring that literacy stayed out of reach of the working classes.
They took their newly concocted rules from Latin. This was, presumably, to keep literate English out of reach of anyone who wasn’t rich or posh enough to attend a grammar school, which was a school where you were taught Latin.
And yes, that is the origin of today’s grammar schools.
The other camp of grammarians are the descriptivists. They write grammar guides that describe how English is used by different people, and for different purposes. They recognise that language isn’t static, and it isn’t one-size-fits-all.
1. You can’t start a sentence with a conjunction
Let’s start with the grammatical sin I have already committed in this article. You can’t start a sentence with a conjunction.
Obviously you can, because I did. And I expect I will do it again before the end of this article. There, I knew I would!
Those who say it is always incorrect to start a sentence with a conjunction, like “and” or “but”, sit in the prescriptivist camp.
However, according to the descriptivists, at this point in our linguistic history, it is fine to start a sentence with a conjunction in an op-ed article like this, or in a novel or a poem.
It is less acceptable to start a sentence with a conjunction in an academic journal article, or in an essay for my son’s high school economics teacher, as it turns out. But times are changing.
2. You can’t end a sentence with a preposition
Well, in Latin you can’t. In English you can, and we do all the time.
Admittedly a lot of the younger generation don’t even know what a preposition is, so this rule is already obsolete. But let’s have a look at it anyway, for old times’ sake.
According to this rule, it is wrong to say “Who did you go to the movies with?”
Instead, the prescriptivists would have me say “With whom did you go to the movies?”
I’m saving that structure for when I’m making polite chat with the Queen on my next visit to the palace.
That’s not a sarcastic comment, just a fanciful one. I’m glad I know how to structure my sentences for different audiences. It is a powerful tool. It means I usually feel comfortable in whatever social circumstances I find myself in, and I can change my writing style according to purpose and audience.
That is why we should teach grammar in schools. We need to give our children a full repertoire of language so that they can make grammatical choices that will allow them to speak and write for a wide range of audiences.
3. Put a comma when you need to take a breath
It’s a novel idea, synchronising your writing with your breathing, but the two have nothing to do with one another. If this is the instruction we give our children, it is little wonder commas are so poorly used.
Commas provide demarcation between like grammatical structures. When adjectives, nouns, phrases or clauses are butting up against each other in a sentence, we separate them with a comma. That’s why I put commas between the three nouns and the two clauses in that last sentence.
Commas also provide demarcation for words, phrases or clauses that are embedded in a sentence for effect. The sentence would still be a sentence even if we took those words away. See, for example, the use of commas in this sentence.
4. To make your writing more descriptive, use more adjectives
Unlikely as it sounds, the topic of adjective use has gone “viral”. The furore centres on the claim, taken from Mark Forsyth’s book The Elements of Eloquence, that adjectives appearing before a noun must appear in the following strict sequence: opinion, size, age, shape, colour, origin, material, purpose, and then the noun. Even the slightest attempt to disrupt this sequence, according to Forsyth, will result in the speaker sounding like a maniac. To illustrate this point, Forsyth offers the following example: “a lovely little old rectangular green French silver whittling knife”.
But is the “rule” worthy of an internet storm – or is it more of a ripple in a teacup? Well, certainly the example is a rather unlikely sentence, and not simply because whittling knives are not in much demand these days – ignoring the question of whether they can be both green and silver. This is because it is unusual to have a string of attributive adjectives (ones that appear before the noun they describe) like this.
More usually, speakers of English break up the sequence by placing some of the adjectives in predicative position – after the noun. Not all adjectives, however, can be placed in either position. I can refer to “that man who is asleep” but it would sound odd to refer to him as “that asleep man”; we can talk about the “Eastern counties” but not the “counties that are Eastern”. Indeed, our distribution of adjectives both before and after the noun reveals another constraint on adjective use in English – a preference for no more than three before a noun. An “old brown dog” sounds fine, a “little old brown dog” sounds acceptable, but a “mischievous little old brown dog” sounds plain wrong.
Rules, rules, rules
Nevertheless, however many adjectives we choose to employ, they do indeed tend to follow a predictable pattern. While native speakers intuitively follow this rule, most are unaware that they are doing so; we agree that the “red big dog” sounds wrong, but don’t know why. To test this intuition, linguists have analysed large corpora of electronic data to see how frequently pairs of adjectives like “big red” are preferred to “red big”. The results confirm our native intuition, although the figures are not as clear-cut as we might expect – the rule accounts for 78% of the data.
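The corpus method described above boils down to counting how often each ordering of an adjective pair occurs. Here is a minimal sketch of that counting step, using a tiny invented corpus as a stand-in for the large electronic corpora real studies rely on:

```python
# Count each adjacent ordering of a given pair of adjectives in a corpus.
# The corpus below is a toy, invented example for illustration only.
from collections import Counter

corpus = [
    "she bought a big red balloon",
    "a big red truck drove past",
    "he painted a red big mural",        # the rarer, dispreferred order
    "they adopted a little old brown dog",
]

def pair_order_counts(corpus, adjectives):
    """Count adjacent occurrences of each ordering of the adjectives."""
    counts = Counter()
    for sentence in corpus:
        words = sentence.split()
        for first, second in zip(words, words[1:]):
            if first in adjectives and second in adjectives:
                counts[(first, second)] += 1
    return counts

counts = pair_order_counts(corpus, {"big", "red"})
print(counts[("big", "red")], counts[("red", "big")])  # 2 1
```

On a genuine corpus, comparing the two counts for many such pairs yields the kind of preference figure quoted above (the rule holding for 78% of the data).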
But while linguists have been able to confirm that there are strong preferences in the ordering of pairs of adjectives, no such statistics have been produced for longer strings. Consequently, while Forsyth’s rule appears to make sense, it remains an untested, hypothetical, large, sweeping (sorry) claim.
In fact, even if we stick to just two adjectives it is possible to find examples that appear to break the rule. The “big bad wolf” of fairy tale, for instance, shows the size adjective preceding the opinion one; similarly, “big stupid” is more common than “stupid big”. Examples like these are instead witness to the “Pollyanna Principle”, by which speakers prefer to present positive, or indifferent, values before negative ones.
Another problem with Forsyth’s proposed ordering sequence is that it makes no reference to other constraints that influence adjective order, such as when we use two adjectives that fall into the same category. Little Richard’s song “Long Tall Sally” would have sounded strange if he had called it Tall Long Sally, but these are both adjectives of size.
Similarly, we might describe a meal as “nice and spicy” but never “spicy and nice” – reflecting a preference for the placement of general opinions before more specific ones. We also need to bear in mind the tendency for noun phrases to become lexicalised – forming words in their own right. Just as a blackbird is not any kind of bird that is black, a little black dress does not refer to any small black dress but one that is suitable for particular kinds of social engagement.
Since speakers view a “little black dress” as a single entity, its order is fixed; as a result, modifying adjectives must precede “little” – a “polyester little black dress”. This means that an adjective specifying its material appears before those referring to size and colour, once again contravening Forsyth’s rule.
Making sense of language
Of course, the rule is a fair reflection of much general usage – although the reasons behind this complex set of constraints in adjective order remain disputed. Some linguists have suggested that it reflects the “nouniness” of an adjective; since colour adjectives are commonly used as nouns – “red is my favourite colour” – they appear close to that slot.
Another conditioning factor may be the degree to which an adjective reflects a subjective opinion rather than an objective description – therefore, subjective adjectives that are harder to quantify (boring, massive, middle-aged) tend to appear further away from the noun than more concrete ones (red, round, French).
Prosody, the rhythm and sound of poetry, is likely to play a role, too – as there is a tendency for speakers to place longer adjectives after shorter ones. But probably the most compelling theory links adjective position with semantic closeness to the noun being described; adjectives that are closely related to the noun in meaning, and are therefore likely to appear frequently in combination with it, are placed closest, while those that are less closely related appear further away.
In Forsyth’s example, it is the knife’s whittling capabilities that are most significant – distinguishing it from a carving, fruit or butter knife – while its loveliness is hardest to define (what are the standards for judging the loveliness of a whittling knife?) and thus most subjective. Whether any slight reorganisation of the other adjectives would really prompt your friends to view you as a knife-wielding maniac is harder to determine – but then, at least it’s just a whittling knife.
The British Council is to open a bilingual pre-school in Hong Kong in August. The International Pre-School, which will teach English and Cantonese and have specific times set aside for Mandarin, will follow the UK-based International Primary Curriculum.
The British Council already has bilingual pre-schools in Singapore (pictured above) and Madrid. The adoption of a bilingual model of early years learning, rather than a purely English-medium one, is supported by much of the research on this age group. In a randomised control trial in the US state of New Jersey, for example, three- and four-year-olds from both Spanish- and English-speaking backgrounds were assigned by lottery to either an all-English or English–Spanish pre-school programme which used an identical curriculum. The study found that children from the bilingual programme emerged with the same level of English as those in the English-medium one, but both the Spanish-speaking and anglophone children had a much higher level of Spanish.
After Brexit, there are various things that some in the EU hope to see and hear less in the future. One is Nigel Farage. Another is the English language.
In the early hours of June 24, as the referendum outcome was becoming clear, Jean-Luc Mélenchon, left-wing MEP and French presidential candidate, tweeted that “English cannot be the third working language of the European parliament”.
This is not the first time that French and German opinion has weighed in against alleged disproportionate use of English in EU business. In 2012, for example, a similar point was made about key eurozone recommendations from the European Commission being published initially “in a language which [as far as the Euro goes] is only spoken by less than 5m Irish”. With the number of native speakers of English in the EU set to drop from 14% to around 1% of the bloc’s total with the departure of the UK, this point just got a bit sharper.
Official EU language policy is multilingualism with equal rights for all languages used in member states. It recommends that “every European citizen should master two other languages in addition to their mother tongue” – Britain’s abject failure to achieve this should make it skulk away in shame.
The EU recognises 24 “official and working” languages, a number that has mushroomed from the original four (Dutch, French, German and Italian) as more countries have joined. All EU citizens have a right to access EU documents in any of those languages. This calls for a translation team numbering around 2,500, not to mention a further 600 full-time interpreters. In practice most day-to-day business is transacted in either English, French or German and then translated, but it is true that English dominates to a considerable extent.
The preponderance of English has nothing to do with the influence of Britain or even Britain’s membership of the EU. Historically, the expansion of the British empire, the impact of the industrial revolution and the emergence of the US as a world power have embedded English in the language repertoire of speakers across the globe.
Unlike Latin, which outlived the Roman empire as the lingua franca of medieval and renaissance Europe, English of course has native speakers (who may be unfairly advantaged), but it is those who have learned English as a foreign language – “Euro-English” or “English as a lingua franca” – who now constitute the majority of users.
According to the 2012 Special Eurobarometer on Europeans and their Languages, English is the most widely spoken foreign language in 19 of the member states where it is not an official language. Across Europe, 38% of people speak English well enough as a foreign language to have a conversation, compared with 12% for French and 11% for German.
The report also found that 67% of Europeans consider English the most useful foreign language, and that the numbers favouring German (17%) or French (16%) have declined. As a result, 79% of Europeans want their children to learn English, compared to 20% for French and German.
Too much invested in English
Huge sums have been invested in English teaching by both national governments and private enterprise. As the demand for learning English has increased, so has the supply. English language learning worldwide was estimated to be worth US$63.3 billion (£47.5 billion) in 2012, and it is expected that this market will rise to US$193.2 billion (£145.6 billion) by 2017. The value of English for speakers of other languages is not going to diminish any time soon. There is simply too much invested in it.
Speakers of English as a second language outnumber first-language English speakers by 2:1 both in Europe and globally. For many Europeans, and especially those employed in the EU, English is a useful piece in a toolbox of languages to be pressed into service when needed – a point which was evident in a recent project on whether the use of English in Europe was an opportunity or a threat. So in the majority of cases using English has precisely nothing to do with the UK or Britishness. The EU needs practical solutions and English provides one.
English is unchallenged as the lingua franca of Europe. It has even been suggested that in some countries of northern Europe it has become a second rather than a foreign language. Jan Paternotte, D66 party leader in Amsterdam, has proposed that English should be decreed the official second language of that city.
English has not always held its current privileged status. French and German have both functioned as common languages for high-profile fields such as philosophy, science and technology, politics and diplomacy, not to mention Church Slavonic, Russian, Portuguese and other languages in different times and places.
We can assume that English will not maintain its privileged position forever. Who benefits now, however, are not the predominantly monolingual British, but European anglocrats whose multilingualism provides them with a key to international education and employment.
Much about the EU may be about to change, but right now an anti-English language policy so dramatically out of step with practice would simply make the post-Brexit hangover more painful.
The EFL industry in Spain enjoyed a mini boom during the early years of the global economic crisis as many adult students rushed to improve their English language skills, either to get themselves back into the job market, or else in an attempt to hang on to the job they had. As we reached the new decade, the boom slowed down and then started to tail off. But no-one expected the sudden and significant drop in adult student numbers that hit the industry at the start of the current academic year.
The drop wasn’t school, city, or even region specific; it was the same story all over Spain. And the numbers were eye-watering. Depending on who you talk to (and/or who you believe), adult student numbers fell by between 10% and 20%. Enough to make any school owner or manager wince.
What happened? Where did all these students go? Well, as is normally the case, there is no one, simple answer. There has been a slight upturn in in-company teaching, so it may be that some students, who were previously paying for their own courses in our schools, are now studying in their company (if they’re fortunate enough to have a job in the first place; Spanish unemployment is still well over 20%).
The standard of English teaching in mainstream education is also getting better, slowly, so it may be that there are more school leavers who have achieved a basic level of communicative competence.
Some adult students – especially the younger ones – may also have decided to switch from a traditional bricks-and-mortar language school to a web-based classroom.
My own theory is that it’s the free movement of labour in the European Union which is having the greatest effect on our market. In other words, as there are so few jobs available in Spain, hundreds of thousands of young adults – many of whom may previously have been our students – have simply upped sticks and gone abroad to find work.
A recent survey conducted in the UK indicates that the number of migrants from Spain rose to 137,000 in 2015 (up from 63,000 in 2011). Most of them are probably working in relatively unskilled jobs in hotels, bars and restaurants, but at least they’re working – and they’re improving their English language skills as they go.
A similar number probably emigrated to other countries in the north of Europe and another significant number emigrated to Latin America. Add up all these emigrants and we could be looking at a total of well over 300,000 migrants – just in 2015.
On a recent trip to Oxford I met a young Spanish guy, working in a hotel, who had previously been a student at our school in Barcelona. He’s a typical example. I asked him whether he would ever move back to Spain. Perhaps, in the future, he said, but only if the situation in Spain changes and he can find a decent job. His new fluency in English, learnt by living and working in Oxford, might just help him with that.
So where does that leave Spanish language schools? Will adult students come back to our schools in the same numbers as before? Probably not. But that doesn’t mean we have to give up on this market. If adult students won’t come to us, we can use the Internet to take our services to them. Even those living and working abroad.
This year’s winners – Jairam Hathwar from Painted Post, New York and Nihar Janga from Austin, Texas – present a familiar combination of co-champions. Jairam is the younger brother of 2013 co-champion Sriram, who also dueled with a Texan to ultimately share the trophy.
As a topic of intense speculation on broadcast and social media, the wins have elicited comments that range from curiosity to bafflement and at times outright racism. This curiosity is different from past speculation about “whether home-schooled spellers have an advantage.”
The range of responses offers a moment to consider some of the factors underlying the Indian-American success at the bee, as well as how spelling as a sport has changed. Immediately following the 2016 bee, for instance, much of the coverage focused on the exceedingly high level of competition and drama that characterized the 25-round championship battle that ultimately resulted in a tie.
Since 2013, I have been conducting research on competitive spelling at regional and national bees with officials, spellers and their families, and media producers.
My interviews and observations reveal the changing nature of spelling as a “brain sport” and the rigorous regimens of preparation that competitive spellers engage in year-round. Being an “elite speller” is a major childhood commitment that has intensified as the bee has become more competitive in recent years.
Let’s first look at history
South Asian-American spelling success is connected to the history of this ethnic community’s immigration to the United States.
For instance, the 1965 Hart-Cellar Act solicited highly trained immigrants to meet America’s need for scientists, engineers and medical professionals and opened the door to skilled immigration from Asia and other regions. In subsequent decades, skilled migration from South Asia continued alongside the sponsorship of family members.
Today, along with smaller, older communities of Punjabi Sikhs and other South Asian ethnic groups primarily on the West Coast, South Asian-Americans constitute a diverse population that features a disproportionately high professional class, although with differences of class, languages, ethnicities and nationalities – differences that are often overlooked in favor of a narrative of Indian-American educational and professional success.
The question is, what gives the community an edge?
For upwardly mobile South Asian-Americans, success is in part due to moving from one socially and economically advantageous societal position in the subcontinent to another in the United States.
Moreover, the English-speaking abilities of most educated South Asian-Americans clearly give them an edge over immigrants from other countries. My research indicates that fluency developed in English-medium schools – a legacy of British colonialism – makes them ideal spelling interlocutors for their children, despite their use of British spellings. Members of this population with elite educational qualifications have likewise emphasized the importance of academic achievement with their children.
Over the past few years spelling bees have been established exclusively for children of South Asian parentage.
For instance, the North South Foundation holds a range of educational contests, such as spelling bees, math contests, geography bees and essay writing, among others, whose proceeds contribute to promoting literacy efforts in India. The South Asian Spelling Bee, partnering with the insurance company MetLife, offers a highly competitive bee as well.
Taken together, this “minor league” circuit gives South Asian-American spellers far more opportunities to compete, as well as a longer “bee season” to train and practice.
This is particularly helpful because, as past champions confirm, ongoing practice and training are the key to winning.
Another factor to note here is the parental ability to dedicate time to education and extracurricular activities. Predictably, families with greater socioeconomic means are able to devote more resources and time.
These parents are as invested in spelling bees and academic competitions as families with star athletes or musicians might be in their children’s matches or performances. As several parents explained to me, spelling bees are the “brain sports” equivalent of travel soccer or Little League.
Of the 30 families I interviewed, the majority had a stay-at-home parent (usually the mother) dedicated to working with children on all activities, including spelling. In dual-income households, spelling training occurred on weeknights and weekends.
Like elite spellers of any race or ethnicity, South Asian-American spellers I spoke with studied word lists daily if possible, logging in several hours on weekends with parents or paid coaches to help them develop strategies and quiz them on words.
A few parents have been so invested in helping their children prepare that they have now started training and tutoring other aspiring spellers as well.
Like any national championship, the pressure on all spellers at a competition on the scale of the National Spelling Bee is intense. South Asian-American children already face pressure to live up to the model minority stereotype and feel no reprieve here.
This is especially important to consider when South Asian-American spellers come from lower socioeconomic classes, but nonetheless succeed at spelling bees.
Among the 2015 finalists, for instance, one was the son of motel owners and a crowd favorite, as I observed. He had competed in the bee several times, and his older sister was also a speller, having made it to nationals once. Remarkably, they prepared for competitions by themselves, with no stay-at-home parent or paid coach.
Another 2015 semifinalist was featured in a broadcast segment living in the crowded immigrant neighborhood of Flushing, New York. When I visited this three-time National Spelling Bee participant in 2014, I realized that she lived in the very same apartment complex that my family did in the 1970s. This Queens neighborhood continues to be a receiving area for Indian-Americans who may not have the economic means to live in wealthier sections of New York City or its suburbs.
Many possible explanations
The point is that the reasons that Indian-American spellers are succeeding at the bee are not easily reducible to one answer.
South Asian-Americans, like other Asian immigrants, comprise varying class backgrounds and immigration histories. Yet it is noteworthy that even within this range of South Asian-American spellers, it is children of Indian-American immigrants from professional backgrounds who tend to become champions.
The time and resources Indian-American families devote to this brain sport, as I have observed, appear to be raising this competition to previously unseen levels of difficulty.
This can take a toll on elite spellers, who have to invest far more time studying spelling than in the past. With more difficult words appearing in earlier rounds of competition, spelling preparation can take up much of their time outside of school.
Nonetheless, they emphasize the perseverance they develop from competitive spelling. They learn to handle increasing levels of pressure, and alongside this, what they identify as important life skills of focus, poise and concentration.
Ultimately, what makes Indian-American children successful at spelling is the same as what makes children of any other ethnicity successful. They come from families who believe in the value of education and also have the financial means to support their children through every stage of their schooling. And they are highly intelligent individuals who devote their childhood to the study of American English.
Are they American?
Some comments on social media, however, seem to discount these factors and years of intense preparation, focusing instead on race and ethnicity as the sole explanations for spelling success.
In a refreshing shift in tone, this year’s commentary also touched on the ferocity of Janga’s competition style and the inspiration he drew from his football hero Dez Bryant.
Nonetheless, such comments, directed toward nonwhite children when they win this distinctly American contest, do push us to reflect: what does it mean to be an American now?
In alleging that only “Americans” should win this contest, Twitter racists ignore that these spellers too have been born and raised in the United States. Recent winners hail from suburbs or small towns in upstate New York, Kansas, Missouri and Texas. They express regional pride in these locations by mentioning regional sports teams and other distinctive features in their on-air profiles.
With their American-accented English and distinctly American comportment, it is merely their skin color and names that set them apart from a white mainstream.
Like generations of white Americans and European immigrants, Indian-American parents spend countless hours preparing word lists, quizzing their children and creating ways for their children to learn. They encourage their children in whatever they are good at, including spelling.
As a result, they have elevated this American contest to a new level of competition. Clearly, this is an apt moment to expand our definition of what it means to be an American.
This is an updated version of an article first published on June 4, 2015.
Growing up in China, I started playing piano when I was nine years old and learning English when I was 12. Later, when I was a college student, it struck me how similar language and music are to each other.
Language and music both require rhythm; otherwise they don’t make any sense. They’re also both built from smaller units – syllables and musical beats. And the process of mastering them is remarkably similar, including precise movements, repetitive practice and focused attention. I also noticed that my musician peers were particularly good at learning new languages.
All of this made me wonder if music shapes how the brain perceives sounds other than musical notes. And if so, could learning music help us learn languages?
Music experience and speech
Music training early in life (before the age of seven) can have a wide range of benefits beyond musical ability.
For instance, school-age children (six to eight years old) who participated in two years of musical classes four hours each week showed better brain responses to consonants compared with their peers who started one year later. This suggests that music experience helped children hear speech sounds.
But what about babies who aren’t talking yet? Can music training this early give babies a boost in the steps it takes to learn language?
The first year of life is the best time in the lifespan to learn speech sounds; yet no studies have looked at whether musical experience during infancy can improve speech learning.
I sought to answer this question with Patricia K. Kuhl, an expert in early childhood learning. We set out to study whether musical experience at nine months of age can help infants learn speech.
Nine months is within the peak period for infants’ speech sound learning. During this time, they’re learning to pay attention to the differences among the different speech sounds that they hear in their environment. Being able to differentiate these sounds is key for learning to speak later. A better ability to tell speech sounds apart at this age is associated with producing more words at 30 months of age.
Here is how we did our study
In our study, we randomly assigned 47 nine-month-old infants to either a music group or a control group, and each infant completed 12 sessions, each 15 minutes long, of activities designed for that group.
Babies in the music group sat with their parents, who guided them through the sessions by tapping out beats in time with the music with the goal of helping them learn a difficult musical rhythm.
Infants in the control group played with toy cars, blocks and other objects that required coordinated movements in social play, but without music.
After the sessions, we measured the babies’ brain responses to musical and speech rhythms using magnetoencephalography (MEG), a brain imaging technique.
New music and speech sounds were presented in rhythmic sequences, but the rhythms were occasionally disrupted by skipping a beat.
These rhythmic disruptions let us measure how well the babies’ brains were attuned to rhythms. The brain produces a specific response pattern when it detects an unexpected change; a bigger response indicates that the baby was following the rhythms better.
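To make the skipped-beat design concrete, here is a minimal sketch of how such a stimulus sequence might be generated. This is purely illustrative: the function name `make_rhythm` and its parameters (beat interval, skip probability) are invented for the example and are not the stimulus code from the study.

```python
import random

def make_rhythm(n_beats=60, skip_prob=0.1, ioi=0.33, seed=0):
    """Build onset times for a steady rhythm in which some beats are
    occasionally skipped -- the 'disruptions' a listening brain can detect.
    ioi is the inter-onset interval in seconds."""
    rng = random.Random(seed)
    played, skipped = [], []
    for i in range(n_beats):
        t = round(i * ioi, 2)
        if i > 0 and rng.random() < skip_prob:
            skipped.append(t)   # beat expected at time t, but omitted
        else:
            played.append(t)    # beat sounds at its expected time
    return played, skipped

played, skipped = make_rhythm()
print(f"{len(played)} beats played, {len(skipped)} beats skipped")
```

A brain that has learned the underlying beat should respond differently at the `skipped` times, which is what a larger MEG response to the disruptions would reflect.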
Babies in the music group had stronger brain responses to both music and speech sounds compared with babies in the control group. This shows that musical experience, as early as nine months of age, improved infants’ ability to process both musical and speech rhythms.
These skills are important building blocks for learning to speak.
Another researcher, Laura Cirelli, showed that 14-month-old babies were more likely to show helping behaviors toward an adult after the babies had been bounced in sync with the adult who was also moving rhythmically.
There are many more exciting questions that remain to be answered as researchers continue to study the effects of music experience on early development.
For instance, does the music experience need to be in a social setting? Could babies get the benefits of music from simply listening to music? And, how much experience do babies need over time to sustain this language-boosting benefit?
Music is an essential part of being human. It has existed in human cultures for thousands of years, and it is one of the most fun and powerful ways for people to connect with each other. Through scientific research, I hope we can continue to reveal how music experience influences brain development and language learning of babies.
Instead, it demonstrates our current ability to promulgate such nonsense words, allowing them to gain sudden currency, perhaps through “trending”, to make use of another relative newcomer to the fold, or “retweeting”. (These are not to be confused with the newcomer “subtweets” which themselves perpetuate another long tradition in English – that of making snide remarks through indirect allusion in a public arena. Alexander Pope would have been a great subtweeter.)
Some of the newly accepted words make one of the main processes of linguistic evolution clear: that of creating a new word by analogy with one already in use. “Binge-watching” is the clearest example. This is the viewing of several episodes or indeed whole series of a televised drama in one sitting. This word is clearly created by analogy with “binge-drinking”, which came to replace the phrase “going on a binge” or “going on a bender” when referring to drinking large amounts of alcohol over a short space of time.
Yes, there’s a difference here – where the earlier two phrases indicated that the occurrence was infrequent, if not actually unusual, “binge-drinking” is habitual, normally taking place at weekends, much as “binge-watching” does for many. I’d like to think that “binge-browsing” might be next, with the specific meaning of spending hours browsing the OED site when one visited to look up just one word. But possibly this is not a habit to encourage, after all, “YOLO”.
Such changes always provoke reaction. Reliably, this varies from outrage at the abuse of language and ignorance of etymological development that such words betray, to celebration of English as a language flexible enough to admit such vibrant new forms and accommodate the creativity of its users.
But what’s interesting to me, as someone whose most frequent uses of dictionaries are to correct spelling and check historical usage, is the way that great institution, the Oxford English Dictionary, is able to satisfy two roles at once. This is thanks to its dual format – in print and online. It’s the online version that will soon include “listicle” and the rest, with no guarantee that these words will make it into the next print version (assuming there is one, which is what the current distinction between print and online versions implies).
This allows the OED to record passing uses and trends without compromising its role as final arbiter of whether or not a word can be said to have entered the English language. This is, after all, a decision which to a large extent depends on proving that a word not only gained currency but retained a decent level of recorded usage over a period of time and, crucially, in print.
And so print retains its sense of permanence in the face of ephemeral but ubiquitous electronic media. Or apparently ephemeral. The recent ruling requiring Google in particular to “remove” records from the internet has reminded us that it is in fact all but impossible to delete anything committed to the electronic ether – however paradoxical that seems. It’s all still out there, it’s just no longer appearing in the search results.
Googling itself is a word now accepted by the online OED, and while at first its currency was an indicator of the success of the company, it’s interesting to speculate on the survival of the word should Google itself go under, or lose its predominant position. Would we then all revert to “web-searching” for background information, or would we google, just as we hoover, forgetful of the fact that the common verb once indicated a specific, dominant company?
Can’t remember the name of the two elements that scientist Marie Curie discovered? Or who won the 1945 UK general election? Or how many light minutes away the sun is from the Earth? Ask Google.
Constant access to an abundance of online information at the click of a mouse or tap of a smartphone has radically reshaped how we socialise, inform ourselves of the world around us and organise our lives. If all facts can be summoned instantly by looking online, what’s the point of spending years learning them at school and university? In the future, it might be that once young people have mastered the basics of how to read and write, they undertake their entire education merely through accessing the internet via search engines such as Google, as and when they want to know something.
Some educational theorists have argued that you can replace teachers, classrooms, textbooks and lectures by simply leaving students to their own devices to search and collect information about a particular topic online. Such ideas have called into question the value of a traditional system of education, one in which teachers simply impart knowledge to students. Of course, others have warned against the dangers of this kind of thinking and the importance of the teacher and human contact when it comes to learning.
Such debate about the place and purpose of online searching in learning and assessments is not new. But rather than thinking of ways to prevent students from cheating or plagiarising in their assessed pieces of work, maybe our obsession with the “authenticity” of their coursework or assessment is missing another important educational point.
Digital content curators
In my recent research looking at the ways students write their assignments, I found that increasingly they may not always compose written work which is truly “authentic”, and that this may not be as important as we think. Instead, through prolific use of the internet, students engaged in a number of sophisticated practices to search, sift, critically evaluate, anthologise and re-present pre-existing content. Through a close examination of the moment-by-moment work of the way students write assignments, I came to see how all the pieces of text students produced contained elements of something else. These practices need to be better understood and then incorporated into new forms of education and assessment.
These online practices are about harnessing an abundance of information from a multitude of sources, including search engines like Google, in what I call a form of “digital content curation”. Curation in this sense is about how learners use existing content to produce new content through engaging in problem-solving and intellectual inquiry, and creating a new experience for readers.
Lessons in how to search.
Part of this is developing a critical eye about what’s being searched for online, or “crap-detection”, whilst wading through the deluge of available information. This aspect is vital to any educationally serious notion of information curation, as learners increasingly use the web as extensions of their own memory when searching.
Students must begin by understanding that most online content is already curated by search engines like Google, using their PageRank algorithm and other signals. Curation, therefore, becomes a kind of stewardship of other people’s writing and requires entering into a conversation with the writers of those texts. It is a crucial kind of “digital literacy”.
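The core idea behind PageRank – that a page matters more when it is linked to by pages that themselves matter – can be sketched in a few lines. This is a simplified illustration of the published algorithm, not Google’s production system; the toy link graph `web` and the helper name `pagerank` are invented for the example.

```python
def pagerank(links, damping=0.85, iters=50):
    """Minimal PageRank via power iteration.
    links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            targets = outs if outs else pages    # dangling page spreads evenly
            share = rank[p] / len(targets)
            for q in targets:                    # pass rank along each link
                new[q] += damping * share
        rank = new
    return rank

web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(web)
print(max(ranks, key=ranks.get))  # the page with the most incoming link mass
```

In this toy graph, “C” ends up ranked highest because it is linked to by both “A” and “B” – which is exactly the sense in which search results arrive pre-curated: the ordering students see already encodes judgements about which sources the web itself treats as important.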
Curation has, through pervasive connectivity, found its way into educational contexts. There is now a need to better understand how practices of online searching and the kinds of writing emerging from curation can be incorporated into the way we assess students.
How to assess these new skills
While writing for assessment tends to focus on the production of a student’s own, “authentic” work, it could also take curation practices into account. Take, for example, a project designed as a kind of digital portfolio. This could require students to locate information on a particular question, organise existing web extracts in a digestible and story-like way, acknowledge their sources, and present an argument or thesis.
Solving problems through synthesising large amounts of information, often collaboratively, and engaging in exploratory and problem-solving pursuits (rather than just memorising facts and dates) are key skills in the 21st-century, information-based economy. As the London Chamber of Commerce has highlighted, we must make sure young people and graduates enter employment with these skills.
My own research has shown that young people may already be expert curators as part of their everyday internet experience and surreptitious assignment writing strategies. Teachers and lecturers need to explore and understand these practices better, and create learning opportunities and academic assessment tasks around these somewhat “hard to assess” skills.
In an era of informational abundance, educational end-products – the exam or piece of coursework – need to become less about a single student creating an “authentic” text, and more about a certain kind of digital literacy which harnesses the wisdom of the network of information that is available at the click of a button.
by Gianfranco Conti, PhD. Co-author of “The Language Teacher Toolkit” and “Breaking the Sound Barrier: Teaching Learners How to Listen”, winner of the 2015 TES best resource contributor award and founder of www.language-gym.com