Why 1904 testing methods should not be used for today’s students


Robert Sternberg, Cornell University

When I was an elementary school student, schools in my hometown administered IQ tests every couple of years. I felt very scared of the psychologist who came in to give those tests.

I also performed terribly. As a result, at one point, I was moved to a lower-grade classroom so I could take a test more suitable to my IQ level.

Consequently, I believed that my teachers considered me stupid. I, of course, thought I was stupid too, and assumed my teachers expected low-quality work from a child of such low IQ. So, I gave them what they expected.

Had it not been for my fourth grade teacher, who thought there was more to a person than an IQ test score, I almost certainly would not be a professor today.

You might think things have gotten better. Not quite. I have two generations of children (from different marriages), and something similar happened to both my sons: Seth, age 36, now a successful entrepreneur in Silicon Valley, and Sammy, age four.

Some children as young as Sammy take preschool tests. And almost all our students – at least those wanting to go on to college – take the SAT or the ACT, which are essentially IQ tests by another name.

Testing is compromising the future of many of our able students. Today’s testing comes at the expense of validity (strong prediction of future success), equity (ensuring that members of various groups have an equal shot), and common sense in identifying those students who think deeply and reflectively rather than those who are good at answering shallow multiple-choice questions.

How should today’s students be assessed?

Intelligence tests in Halloween costumes

Psychology professor Douglas Detterman and his colleagues have shown that the SAT and the ACT are little more than disguised IQ tests.

They may look slightly different from the IQ tests, but they closely resemble the intelligence tests used by Charles Spearman (1904), Alfred Binet and Theodore Simon (1916), famous psychologists in Great Britain and France, respectively, who created the first IQ tests a century ago.

While these tests may have been at the cutting edge at the turn of the 20th century, today they are archaic. Imagine using medical tests designed at the beginning of the 20th century to diagnose, say, cancer or heart disease.

Multiple choice questions don’t teach life skills.

People’s success today scarcely hinges on solving simple, pat academic problems with unique solutions conveniently presented as multiple-choice options.

When your kids (or colleagues) misbehave, does anyone give you five options, one of which is uniquely correct, to solve the problem of how to get them to behave properly?

Or, are there any multiple-choice answers for how to solve serious crises, whether in international affairs (eg, in Syria), in business (eg, at Volkswagen) or in education (eg, skyrocketing college tuitions)?

How do we test for success?

The odd thing is that we can do much better. That would mean taking into account that academic and life success involves much more than IQ.

In research conducted with colleagues including Florida State University professor Richard Wagner and George Forsythe, a former professor at the US Military Academy at West Point, we found that success in managerial, military and other leadership jobs can be predicted independently of IQ levels.

More generally, we have found that practical intelligence, or common sense, is itself largely independent of IQ. Moreover, my research with Todd Lubart, now a professor at the University of Paris V, has shown that creative intelligence also is distinct from IQ.

My colleagues and I, including Professor Elena Grigorenko at Yale, have shown in studies on five continents that children from diverse cultures, such as Yup’ik Eskimos in Alaska, Latino-American students in San Jose, California, and rural Kenyan schoolchildren, may have practical adaptive skills that far exceed those of their teachers (such as how to hunt in the frozen tundra, ice-fish, or treat parasitic illnesses such as malaria with natural herbal medicines).

Yet teachers – and IQ tests – may view these children as intellectually challenged.

What are we testing, anyway?

Our theory of “successful intelligence” can help predict the academic, extracurricular and leadership success of college students. In addition, it could increase applications from qualified applicants and decrease differences among ethnic groups, such as between African-American and Euro-American students, that are found in the SAT/ACT.

The idea behind “successful intelligence” is to measure not only analytical skills, as the SAT/ACT do, but also other skills that are important to college and life success. Although this does mean additional testing, it is an assessment of strength-based skills that is actually fun to take.

What are these other skills and assessments, exactly?

The truth is, you can’t get by in life only on analytical skills – you also need to come up with your own new ideas (creativity), know how to apply your ideas (practical common sense), and ensure they benefit others besides yourself (wisdom).

So, assessments of “successful intelligence” would measure creativity, common sense and wisdom/ethics, in addition to analytical skills, as measured by the SAT/ACT.

Here is how measurement of successful intelligence works:

Creative skills can be measured by having students write or tell a creative story, design a scientific experiment, draw something new, caption a cartoon or suggest what the world might be like today if some past event (such as the defeat of the Nazis in World War II) had turned out differently.

Practical skills can be measured by having students watch several videos of college students facing practical problems – and then solving the problems for the students in the videos, or by having students comment on how they persuaded a friend of some ideas that the friend did not initially accept.

Wisdom-based and ethical skills can be measured by problems such as what to do upon observing a student cheating, or commenting on how one could, in the future, make a positive and meaningful difference to the world, at some level.

A new way to test

My collaborators and I first tested our ideas between 2000 and 2005, when I was IBM professor of psychology and education and professor of management at Yale. We found (in our “Rainbow Project”) that we could double the accuracy of predicting freshman-year grades relative to the SAT alone.

Also, relative to the SAT, we reduced by more than half ethnic-group differences between Euro-Americans, Asian-Americans, African-Americans, Latino-Americans and American Indians.

Later, in 2011, I collaborated with Lee Coffin, dean of undergraduate admissions at Tufts University, on a project called Kaleidoscope. At the time, I was dean of arts and sciences at Tufts. Kaleidoscope was optional for all undergraduate applicants to Tufts – tens of thousands completed it over the years.

We increased prediction not only of academic success, but also of extracurricular and leadership success, while greatly reducing ethnic-group differences.

Later still, when I was provost and senior vice president of Oklahoma State University (OSU), I worked with Kyle Wray, VP for enrollment management, to implement a similar program at OSU (called the “Panorama Project”) that also was available to all applicants.

The measures are still being used at Tufts and at Oklahoma State. These projects have resulted in students being admitted to Tufts and OSU who never would have made it on the basis of their high school GPAs and SAT scores.

On our assessments, the students displayed potential that was hidden by traditional standardized tests and even by high school grades.

The problem of being stuck

So why don’t colleges move on?

There are several reasons, but the most potent is sheer inertia and fear of change.

College and university presidents and admissions deans around the country have revealed to me in informal conversations that they want change but are afraid to rock the boat.

There are other ways of testing kids.

Moreover, because the SAT, unlike our assessment, is highly correlated with socioeconomic status, colleges like it. College tuition brings in big money, and anything that could affect the dollars is viewed with fear. Students who do well on standardized tests are more likely to be full-pay students, an attraction to institutions of higher learning.

As I know only too well, colleges mostly do what they did before, and changes often require approval of many different stakeholders. The effort to effect change can be daunting.

Finally, there is the problem of self-fulfilling prophecy. We use conventional standardized tests to select students. We then give those high-scoring students better opportunities not only in college but for jobs in our society.

As a result, the tests often make their predictions come true. Given my family history, I know all too well how real the problem of self-fulfilling prophecies is.

The Conversation

Robert Sternberg, Professor of Human Development, Cornell University

This article was originally published on The Conversation. Read the original article.


Signs of our times: why emoji can be even more powerful than words


Vyvyan Evans, Bangor University

Each year, Oxford Dictionaries – one of the world’s leading arbiters on the English language – selects a word that has risen to prominence over the past 12 months as its “Word of the Year”. The word is carefully chosen, based on a close analysis of how often it is used and what it reveals about the times we live in. Past examples include such classics as “vape”, “selfie” and “omnishambles”.

But the 2015 word of the year is not a word at all. It’s an emoji – the “face with tears of joy” emoji, to be precise.

Formerly regarded with disdain as the textual equivalent of an adolescent grunt, it appears that emoji has now gone mainstream. Even if it’s not a fully-fledged language, then it is – at the very least – something that most of us use, most of the time. In fact, more than 80% of all adult smartphone users in the UK regularly use emoji, a finding based on a study I reported in an earlier article.

Yet predictably, Oxford Dictionaries’ selection has raised eyebrows in some quarters. Writing in The Guardian, Hannah Jane Parkinson brands the decision “ridiculous”. For Parkinson, and I’m sure for many other language mavens out there, it’s “ridiculous” because the emoji is not even a word. Surely this is a stunt, they’ll say, dreamt up by clever marketing executives bent on demonstrating just how hip Oxford Dictionaries actually is.

But Parkinson also objects on the basis that there are many other emojis which would make a better word of the year. She suggests the nail painting emoji and the aubergine (or eggplant) emoji as just two examples which have a stronger claim to the title.

Missing the point

But both these complaints miss the point. Emoji – from the Japanese meaning “picture character” (a word which only entered the Oxford Dictionaries in 2013) – is in many respects language-like. Spoken or signed language enables us to convey a message, influence the mental states and behaviours of others and enact changes to our civil and social status. We use language to propose marriage and confirm it, to quarrel, make up and get divorced. Yet emoji has similar functions – it can even get you arrested!

Consider an unusual case from earlier this year: a 17-year-old African American teenager posted a public status update on his Facebook page, featuring a police officer emoji with handgun emojis pointing towards it. This landed him in hot water: the New York District Attorney issued an arrest warrant, for an alleged “terroristic threat”, claiming that the emojis amounted to a threat to harm, or incite others to cause harm, to New York’s finest.

A grand jury ultimately declined to indict the teenager for what is arguably the world’s first alleged emoji terror offence. But the point is that emojis, like language, can both convey a message and provide a means of enacting it – in this case, an alleged call to arms against the NYPD.

Like our treasured English words, emojis are powerful instruments of thought and, potentially, persuasion. Just like language, they can and will be used as evidence against you in a court of law. In short, those who dismiss the language-like nature of emoji fundamentally misunderstand how human communication works in our brave new digital world.

Evolution of the emoji

The second complaint – that there are other emojis more deserving of Oxford Dictionaries’ esteem – also misunderstands how language is evolving in the digital domain.


For one thing, recent research suggests that just under 60% of the world’s daily emoji use is made up of smiling or sad faces of various kinds. The “face with tears of joy” itself now accounts for around 20% of all emoji usage in the UK (representing a fourfold increase in use over the past 12 months), making it arguably one of the most frequently used emojis today. In this sense, it is a perfectly appropriate representation of the main ways we use emoji in our everyday digital lives.

Yet this specific emoji is apt for a deeper reason, too. Emoji is to text-speak what intonation, facial expression and body language are to spoken interaction. While emoji are not conventional words, they nevertheless provide an important contextualisation cue, which enables us to punctuate the otherwise emotionally arid landscape of digital text with personal expression.

Importantly, emoji helps us to elicit empathy from the person we’re addressing – a central requirement of effective communication. It allows us to influence the way our text is interpreted and better express our emotional selves.

One could even argue that, in some ways, emojis are more powerful than words. The “face with tears of joy” emoji effectively conveys a complex emotional spectrum – one that would otherwise require several words – in a single, relatively simple glyph. It manages to evoke an immediate emotional resonance, which might otherwise be lost in a string of words.

Occasionally, emojis can even replace words – this is what linguists refer to as code-switching. In more extreme examples – such as the translation of literary works like Alice in Wonderland – they function exclusively as words and are also given grammatical structure. There’s truly no arguing with the expressive power of emoji.

So while some will unkindly accuse Oxford Dictionaries of a marketing stunt, I applaud them. We are increasingly living in an age of emoji: they are, quite literally, a sign of our times. There’s no doubt that language is here to stay – the great English word is not in peril, and won’t be any time soon. But emoji fills a gap in digital communication – and makes us better at it in the process.

The Conversation

Vyvyan Evans, Professor of Linguistics, Bangor University

This article was originally published on The Conversation. Read the original article.

What will the English language be like in 100 years?


Simon Horobin, University of Oxford

One way of predicting the future is to look back at the past. The global role English plays today as a lingua franca – used as a means of communication by speakers of different languages – has parallels in the Latin of pre-modern Europe.

Having been spread by the success of the Roman Empire, Classical Latin was kept alive as a standard written medium throughout Europe long after the fall of Rome. But the Vulgar Latin used in speech continued to change, forming new dialects, which in time gave rise to the modern Romance languages: French, Spanish, Portuguese, Romanian and Italian.

Similar developments may be traced today in the use of English around the globe, especially in countries where it functions as a second language. New “interlanguages” are emerging, in which features of English are mingled with those of other native tongues and their pronunciations.

Despite the Singaporean government’s attempts to promote the use of Standard British English through the Speak Good English Movement, the mixed language known as “Singlish” remains the variety spoken on the street and in the home.

Spanglish, a mixture of English and Spanish, is the native tongue of millions of speakers in the United States, suggesting that this variety is emerging as a language in its own right.

Meanwhile, automatic translation software, such as Google Translate, may eventually replace English as the preferred means of communication employed in the boardrooms of international corporations and government agencies.

So the future for English is one of multiple Englishes.

Looking back to the early 20th century, it was the Standard English used in England, spoken with the accent known as “Received Pronunciation”, that carried prestige.

But today the largest concentration of native speakers is in the US, and the influence of US English can be heard throughout the world: can I get a cookie, I’m good, did you eat, the movies, “skedule” rather than “shedule”. In the future, to speak English will be to speak US English.

US spellings such as disk and program are already preferred to British equivalents disc and programme in computing. The dominance of US usage in the digital world will lead to the wider acceptance of further American preferences, such as favorite, donut, dialog, center.

What is being lost?

In the 20th century, it was feared that English dialects were dying out with their speakers. Projects such as the Survey of English Dialects (1950-61) were launched at the time to collect and preserve endangered words before they were lost forever. A similar study undertaken by the BBC’s Voices Project in 2004 turned up a rich range of local accents and regional terms which are available online, demonstrating the vibrancy and longevity of dialect vocabulary.

But while numerous dialect words were collected for “young person in cheap trendy clothes and jewellery” – pikey, charva, ned, scally – the word chav was found throughout England, demonstrating how features of the Estuary English spoken in the Greater London area are displacing local dialects, especially among younger generations.

The turn of the 20th century was a period of regulation and fixity – the rules of Standard English were established and codified in grammar books and in the New (Oxford) English Dictionary on Historical Principles, published as a series of volumes from 1884-1928. Today we are witnessing a process of de-standardisation, and the emergence of competing norms of usage.

In the online world, attitudes to consistency and correctness are considerably more relaxed: variant spellings are accepted and punctuation marks omitted, or repurposed to convey a range of attitudes. Research has shown that in electronic discourse exclamation marks can carry a range of exclamatory functions, including apologising, challenging, thanking, agreeing, and showing solidarity.

Capital letters are used to show anger, misspellings convey humour and establish group identity, and smiley-faces or emoticons express a range of reactions.

Getting shorter

Some have questioned whether the increasing development and adoption of emoji pictograms, which allow speakers to communicate without the need for language, mean that we will cease to communicate in English at all. 😉

The fast-changing world of social media is also responsible for the coining and spreading of neologisms, or “new words”. Recent updates to Oxford Dictionaries give a flavour: mansplaining, awesomesauce, rly, bants, TL;DR (too long; didn’t read).


Clipped forms, acronyms, blends and abbreviations have long been productive methods of word formation in English (think of bus, smog and scuba) but the huge increase in such coinages means that they will be far more prominent in the English of 2115.

Whether you 👍 or h8 such words, think they are NBD or meh, they are undoubtedly here to stay.

The Conversation

Simon Horobin, Professor of English Language and Literature, University of Oxford

This article was originally published on The Conversation. Read the original article.

Drop the negative spin on kids who start school bilingual – they are a rich resource for the future


Frank Monaghan, The Open University

There are now more than 1.1 million children in our schools whose first language “is known or believed to be other than English” according to the latest government figures. This confirms a continuous upwards trend that shows no sign of abating.


Many of the 300 or so languages spoken in schools have relatively few speakers but about 20 languages are spoken by 10,000 or more pupils. These children represent a considerable resource. But we are not making the most of it and are even cutting specialist language support for these pupils.

Report after report from the British Chamber of Commerce to the Confederation of British Industry stresses the need for employees with good foreign language skills. Businesses bemoan the fact that the current lack of language skills is a significant barrier to them even attempting to do deals overseas.

In one survey, while 73% of business owners claimed to speak some French, only 4% of them felt confident they were sufficiently skilled to be able to conduct business in the language. The situation is even worse in the fastest growing markets. Less than 1% of business owners believe they have the skills to conduct business in Chinese.

Over the last 15 years, more than a third of UK universities have stopped offering specialist modern European languages degrees, with all the loss of skills that implies.

Optimists at the Department for Education point to healthy rises in the uptake of GCSE foreign languages following the introduction of the English Baccalaureate league table. But while this level of qualification may equip you to order a Bratwurst in a Berlin Bierkeller, it won’t be enough to help you sell your goods to the German businessmen sharing your table.

Multilingual talent pool

And yet, if we look at figures on the dozen or so largest languages spoken in schools, we’ll find a potential talent pool just waiting to be tapped.


One of the main problems is the negative signal sent out whenever news of the growing linguistic diversity in our schools hits the press. This happened earlier this year over reports that one in nine schools had a majority of students whose first language was other than English.

This generally leads to an entirely false assumption that these students don’t speak English, accompanied by predictable outcries about the negative impact of immigration on standards in our schools. In fact, my research has found that of the ten schools with the highest proportion of bilingual students, all but two were rated as outstanding or good by Ofsted.

Chinese students are our highest performing group and the presence of so many Polish students has helped improve the position of many of our Catholic schools in the league tables, as shown by a study carried out by the LSE in 2012.

And the 2012 Programme for International Student Assessment (PISA) tests carried out by the Organisation for Economic Co-operation and Development, showed that the performance of “immigrant” (first or second generation) students in the UK overall was not statistically different from that of “native” students. They were only six points adrift on the mathematics test when socio-economic factors were accounted for. This compared with the OECD average of 34 points lower for immigrant students overall.

Cutbacks and short-sightedness

Yet massive cuts to specialist provision in schools and local authorities for the teaching of English as an additional language seem likely to reverse that trend, as the National Association for Language Development in the Curriculum has warned.

We need a much better understanding about what that label “language other than English” actually means. Students with English as an additional language come in all shapes and sizes, from absolute beginners to students who were born in the UK and have had all their education here.

The problem is that the UK does not have any standardised way of assessing their ability in English (unlike other places such as Canada and Australia, both of which were ranked much higher than the UK in the 2012 PISA tests). Unless we develop a curriculum that recognises that we live not only in a multilingual world but also in a multilingual Britain, we will be selling all our children and all our futures short. The revised National Curriculum, with its ever-narrowing focus on “little England”, seems unlikely to deliver the goods.

Fortunately, it’s not all doom and gloom. Whilst politicians and pundits may fret endlessly and needlessly over the impact of plurilingual students in our schools, the teachers, students and their families and neighbours are just getting on with it.

Research projects carried out by Charmian Kenner and colleagues at Goldsmiths College in London and Angela Creese and her team at the University of Birmingham are involving communities to explore the ways people are actually building on all the languages in their environment to get things done. The evidence so far suggests that everybody gains as a result.

People tend to forget that it was monolingualists who built the tower of Babel – let’s not blame multilingualism for its downfall.

The Conversation

Frank Monaghan, Senior Lecturer in Education and Language Studies, The Open University

This article was originally published on The Conversation. Read the original article.

Does ‘translating’ Shakespeare into modern English diminish its greatness?

Actors perform Shakespeare’s Hamlet at the National Theatre in Sarajevo in September 2005. The theatre was packed for the premiere, in which Hamlet was portrayed as a Muslim prince at the Ottoman court – a reflection, according to director Haris Pasovic, on the world after the September 11, 2001 terror attacks on New York. The multi-national production included artists from seven countries.

Sheila T Cavanagh, Emory University

An uproar ensued after it was reported that the Oregon Shakespeare Festival (OSF) – southern Oregon’s 80-year-old annual theatrical extravaganza – would be commissioning playwrights to “translate” all of Shakespeare’s plays into modern English.

The project drew jeers from Shakespearean professors, arts practitioners and others who believe passionately in the power of Shakespeare’s original texts, who abhor any attempt to “dumb down” their language.

OSF Director of Literary Development and Dramaturgy Lue Douthit and OSF Artistic Director Bill Rauch maintain that OSF is undertaking a bold, not sacrilegious, experiment. Nevertheless, howls of outrage have followed what Douthit ruefully has deemed a “career-ending” announcement for those involved.

As an educator and lover of Shakespearean drama, I remain committed to the value of presenting Shakespeare’s plays in their original language. I require my students to read Shakespeare’s plays in their original form, and through my work on the World Shakespeare Project, I’ve witnessed undergraduates in places such as Uganda, rural India and Buenos Aires enthusiastically respond to the challenge.

Yet the outrage over the OSF’s new modernization project is misguided. The organization – which is known for experimentation – is simply participating in a larger, centuries-long tradition of molding, melding and adapting Shakespeare’s original texts.

Shakespeare for dummies?

Among those criticizing the new project is Columbia University Professor James Shapiro, a prominent Shakespearean scholar who maintains that “by changing the language in this modernizing way…it just doesn’t pack the punch and the excitement and the intoxicating quality of [the original] language.”

Earlier this month, before an audience at Shakespeare’s Globe, he added, “It’s a really bad idea.”

Notably, however, Shapiro (along with many others) responded quite differently to the translation of a different classic text. On Nobel Laureate Seamus Heaney’s oft-praised 1999 rewriting of Beowulf, Shapiro wrote in The New York Times:

Examples like this add up to a translation that manages to accomplish what before now had seemed impossible: a faithful rendering that is simultaneously an original and gripping poem in its own right.

In this instance, at least, Heaney’s talent apparently overcame Shapiro’s objections to the concept.

The playwrights the company has commissioned to “modernize” the language of Shakespeare’s works may or may not achieve the majesty attributed to Seamus Heaney’s Beowulf. But for whatever reason, changing the language of Shakespeare remains anathema, while the setting, costuming and theoretical conceptualization of his plays are fair game for innovation.

The hottest theater ticket in Britain at the moment, for example, is Benedict Cumberbatch’s Hamlet, which caused similar outrage for opening with the famous “To Be or Not to Be” soliloquy, rather than the traditional “Who’s there?”. By the end of previews, the speech was moved back to (one of) the places where it traditionally resides. Cumberbatch’s audiences have been comparatively silent, however, about the production’s addition of modern props, like a phonograph player.

London’s Young Vic Theatre, meanwhile, is currently presenting a strong version of Shakespeare’s Measure for Measure, with a set filled with dozens of naked, anatomically correct, inflatable dolls. Like the phonograph player on the set of Hamlet, it’s unlikely that theatergoers will object to the dolls, nor will they protest the video screens employed during the performance.

But when it comes to changing the language – well, the main objection, it appears, stems from concerns that it will encourage series such as Shakespeare for Dummies or No Fear Shakespeare, which presents original Shakespearean text adjacent to what its editors call “the kind of English people actually speak today.”

Such projects are understandable, if worrisome. Shakespeare does have a reputation for being too dense for ordinary people to easily comprehend.

At the same time, there are many remarkable projects that bring Shakespeare’s plays to even the most unconventional audiences. There’s Curt Tofteland’s Shakespeare Behind Bars, which offers prisoners the opportunity to present full-length Shakespeare plays, while former Royal Shakespeare Company artist Kelly Hunter’s project Shakespeare’s Heartbeat uses Shakespearean drama as the basis for games designed for children with autism.

Play on!

It’s worth noting that the OSF is not planning to replace Shakespeare’s original texts during its current presentation of the complete Shakespearean canon, which will take place over the next decade.

While the company hopes that the newly commissioned versions of Shakespeare will be performed in Oregon and elsewhere, it also retains its commitment to presenting the conventional texts, albeit with regular tweaks and cuts.

As Shapiro and many others admit, Shakespearean drama has been altered, rewritten and reimagined repeatedly since the plays were first presented during the reigns of Elizabeth Tudor and James Stuart.

‘Is life even worth living? That’s what I keep wondering…’

During the English Restoration, King Lear was given a happy ending. More recently, the 2001 film Scotland, Pa. offered a modern retelling of Macbeth, set at a fast food restaurant. Henry IV found itself placed among male prostitutes in Oregon in Gus Van Sant’s 1991 film My Own Private Idaho. Even Justin Kurzel’s acclaimed new film Macbeth opens with a twist: the funeral of Macbeth’s toddler.

The best adaptations – West Side Story, the musical Kiss Me, Kate and the Japanese film Throne of Blood – thrive. The bad, silly and unfortunate – Romeo and Juliet: Sealed with a Kiss and Animal Planet’s Romeo and Juliet: A Monkey’s Tale – fall by the wayside.

As poet Andrew Marvell might say, there is “world enough and time” for any number of Shakespearean adaptations and iterations.

While Shakespeare’s original language is remarkably rich and compelling, like Cleopatra, “age will not wither it.” Neither will OSF’s revisionary experimentation.

The Conversation

Sheila T Cavanagh, Professor of English, Emory University

This article was originally published on The Conversation. Read the original article.

Seven things to consider before you buy into phonics programs


Misty Adoniou, University of Canberra

Phonics, or teaching reading, writing and spelling through sounds, is often touted as the golden path to reading and writing.

National curricula in England and Australia have been rejigged to increase their focus on phonics, and entrepreneurs and publishers have rushed to fill the space with phonics programs and resources.

But before you buy their wares, consider the following.

1. English is not a phonetic language

This may be an inconvenient truth for those promoting phonics programs, but English is not a phonetic language and never has been.

English began about 1,500 years ago as a trio of Germanic dialects brought over to the islands we now know as the British Isles. Latin-speaking missionaries arrived soon after to convert the pagans to Christianity. They also began to write the local lingo down, using their Latin alphabet.

The Latin alphabet was a good phonetic match for spoken Latin, but it was not a good match for spoken Old English.

There were sounds in Old English that simply didn’t exist in spoken Latin, so there were no Latin letters for them. And there were sounds in Latin that didn’t exist in Old English, which left some Latin letters languishing.

Those letters were repurposed and some new letters were introduced. It was a messy match, and 1,500 years of language evolution has only increased the distance between the sounds we make and the letters we write.

As a result, English is alphabetic, but not phonetic. There is a simple sound-letter match in only about 12% of English words. How much of your literacy programming and budget do you want to allocate to that statistic?

2. Sounds are free

The sounds and letters of the English language are the ultimate open access knowledge. Buying them in a packaged program is just a con.

If you weren’t shown the sound-letter relationships in your teaching degree, shame on your degree, but in any case you can Google them or find them in the preface of a good dictionary.

3. Knowing your sounds is not the same as reading

I know all my sounds in French. I even sound reasonably convincing – in an Inspector Clouseau kind of way – when I “read” French. But I have no comprehension, so I’m not really reading.

Children who are failing in literacy in upper primary and high school are not failing because they don’t know their sounds. They are failing because they can’t comprehend.

Observe their attempts to read, write and spell and one thing is very clear – they know their sounds, and they over-rely on them. Give them a phonics program and you are giving them more of what isn’t working for them.

4. Politicians are not educators

The push for phonics in England and Australia was spearheaded very conspicuously, almost personally, by the respective former Education Ministers, Michael Gove and Christopher Pyne. Politicians may have many skills – but they are not educators, and they are not educational researchers.

Educational reforms should not be shaped by personal predilections or political agendas.

5. Programs get it wrong

The narrow focus on sounds and letter patterns in phonics programs obscures more useful information for learning to read, write and spell. On occasion the material presented is just plain wrong.

A popular phonics workbook offers the following explanation for the word “technician”.

“Technician is a technical word. Although it is pronounced ‘shun’ at the end, it belongs to the word family ending in ‘cian’”

Teaching “cian” as a word family is linguistically inaccurate, and fails to teach how the word “technician” actually works.

“ian” is the suffix we attach to base words ending in “ic”, to turn them into the person who does the base word. So “technic” becomes “technician”, “magic” becomes “magician”, “electric” becomes “electrician” etc.

This knowledge develops spelling, builds vocabulary and increases reading comprehension. Being told that “cian” makes the “shun” sound does none of this.

6. Colouring-in is not literacy

Sticking balls of crepe paper on the letter “j” is not a good use of literacy learning time. Neither is colouring in all the pictures on the worksheet that start with “b”, particularly if you thought that picture of the beads was a necklace. And is that a jar or a bottle?

Busy work does not teach children to read and write.

7. There are no easy routes to literacy

Learning to read, write and spell is complex. The brain is not hardwired for literacy in the way it is hardwired for speech.

Each individual brain has to learn to read and write, and because our brains, our genes and our environments are all different, the pathways to literacy that our brains construct will be different.

If a single program could respond to this diversity then we would have solved the literacy problem a few hundred years ago when printed texts for the masses first took off.

Of course there are accounts of students whose progress was turned around by a phonics program – the comments section of this post will no doubt have some of those testimonials – but there are many more who languish in those programs.

Phonics programs can be helpful for students with very particular learning needs, but solutions to pointy end problems are not helpful for all learners.

The alternative?

Consider what the problem is that you are trying to solve before you commit to buying a phonics program.

If the problem is your students write phonetically, and cannot read phonically irregular words, then more phonics is not the solution.

If the problems are reading comprehension and quality of writing, then invest in your library and your staff. Buy quality literature and spend money on professional learning.


Misty Adoniou, Senior Lecturer in Language, Literacy and TESL, University of Canberra

This article was originally published on The Conversation. Read the original article.