
Dodgy data and language misdiagnosis

By Ingrid Piller | March 11, 2015 (updated May 28th, 2019) | 4 Comments | 5 min read | 4,289 views
[Figure: 2014 NAPLAN Year 3 Numeracy test results by language status]

In Australia, the results of last year’s round of student and school performance on national standardized testing have just been published. As has been the case every year since standardized testing was first introduced in 2008, the results of the National Assessment Program – Literacy and Numeracy, or NAPLAN for short, throw up a strange anomaly with regard to language. As an example of this anomaly, let me quote from the summary of the national Year 3 results:

Across Australia, and in all jurisdictions except the Northern Territory, there is very little difference between these two groups [Language Background Other Than English vs. English Language Background] in the percentage of students who achieved below the national minimum standard in any achievement domain. In the Northern Territory, the proportion of students from a language background other than English who achieve below the national minimum standard across the five domains is generally three to five times as high as for students from an English language background. (p. 63)

How strange is that!? Language status makes no difference to literacy and numeracy performance in Australia overall but it makes a significant difference in the Northern Territory? How can this be?

Let’s start with the meaning of ‘Language Background Other Than English.’ Abbreviated as ‘LBOTE,’ it is part of the personal data collected about test takers, and each test paper has a bubble to be shaded if ‘either the student or a parent/guardian speaks a language other than English at home.’

You do not have to be a social scientist or a linguist to see that this is a pointless category: a test taker considered ‘LBOTE’ could be a monolingual speaker of Standard English (with a parent who speaks another language), a bi- or multilingual speaker whose repertoire includes Standard English or some other form of English, or a monolingual speaker of a language other than English who has no proficiency in English whatsoever. In short, ‘LBOTE’ status says nothing about the test taker’s proficiency in Standard Australian English, the language in which the test is administered.
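To make this conflation concrete, here is a minimal Python sketch; the records, field names and proficiency labels are hypothetical illustrations, not NAPLAN’s actual data format. Applying the bubble-shading rule to three very different students yields the identical flag.

    # A minimal sketch of why the LBOTE flag carries no information about
    # English proficiency: three very different students all get the same flag.
    def is_lbote(student):
        """The bubble-shading rule: LBOTE if the student or a parent/guardian
        speaks a language other than English at home."""
        return any(lang != "English" for lang in student["home_languages"])

    students = [
        # Monolingual Standard English speaker with an Italian-speaking parent
        {"name": "A", "home_languages": ["English", "Italian"], "english": "native"},
        # Fluent bilingual whose repertoire includes Standard English
        {"name": "B", "home_languages": ["Mandarin", "English"], "english": "fluent"},
        # Monolingual Kriol speaker with little Standard Australian English
        {"name": "C", "home_languages": ["Kriol"], "english": "minimal"},
    ]

    for s in students:
        print(s["name"], "is LBOTE:", is_lbote(s), "| English:", s["english"])
    # All three print True, although proficiency ranges from native to minimal.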

Because ‘LBOTE’ is a meaningless category, it is not surprising that it produces weird correlations: across Australia, there is little difference in the test results of ‘LBOTE’ and ‘non-LBOTE’ groups, although ‘LBOTE’ students actually slightly outperform ‘non-LBOTE’ students, particularly by Year 9 and particularly in mathematics. The situation is different only in the Northern Territory, where ‘LBOTE’ students perform significantly below ‘non-LBOTE’ students, as we have seen.

The obvious explanation is that most ‘LBOTE’ students across Australia are fluent bilinguals but that the situation is different in the Northern Territory, where most ‘LBOTE’ students are not proficient in English. We have discussed previously how NAPLAN testing discriminates against creole speakers in the Northern Territory.

Why do we accept a meaningless category such as ‘LBOTE’ to be used in national reporting and why do we put up with being presented with nonsensical correlations between ‘LBOTE’ status and academic performance year after year?

ANU linguists Sally Dixon and Denise Angelo found that NAPLAN is not alone: the data schools hold about the language status of their students are generally ‘dodgy,’ nonsensical and illogical. In a survey of 86 Queensland schools, they discovered that only two felt reasonably confident that the language data they held about their students were accurate. In addition to the ‘LBOTE’ status of NAPLAN test takers, schools also recorded a ‘main language other than English’ on enrollment, in forms filled in variously by parents or administrators as they saw fit. If ‘main language other than English’ was left blank on the enrollment form, ‘English’ was sometimes entered into the database instead of a null response. Some students also received ‘English as an additional language or dialect’ status, which teachers assigned on an ad-hoc basis, typically if and when students seemed to have problems and funding for additional English language support was available.

These three categories were internally incoherent and failed to match each other in 84 out of 86 surveyed schools. This shocking finding is partly due to the fact that language-related categories are poorly defined, as we saw in the example of ‘LBOTE.’ It is also related to a general language blindness in schools, further evidence of the monolingual mindset of Australia’s multilingual schools. Schools were particularly ‘language-blind’ when it came to Indigenous children: creoles and contact varieties were not necessarily recognized as anything other than ‘English,’ even if judged to be ‘bad English.’ By the same token, some students with clear ethnic affiliations were categorized as speakers of the ethnic language irrespective of their actual proficiency in that language.
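The kind of incoherence Dixon and Angelo describe can be illustrated with a small Python sketch; the field names and the minimal consistency rule below are my own simplified assumptions, not their survey instrument.

    # Hypothetical cross-check of the three language flags held for a student.
    # Per the survey, records in 84 of 86 schools failed checks of this kind.
    def flags_consistent(record):
        """A main language other than English (MLOTE) implies a language
        background other than English (LBOTE); so does EAL/D status."""
        if record["mlote"] and not record["lbote"]:
            return False
        if record["eald"] and not record["lbote"]:
            return False
        return True

    # The kind of incoherent record the survey describes: 'English' entered
    # into the database where the enrollment form was left blank, alongside
    # an EAL/D status assigned by a teacher.
    record = {"lbote": False, "mlote": False, "eald": True}
    print(flags_consistent(record))  # False: the flags contradict each other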

The overall consequence of all these ‘dodgy data’ floating around in relation to language is that educators come to see language as meaningless, because it does not seem to distinguish one group from another. Overall, ‘LBOTEs’ and ‘non-LBOTEs’ seem to perform more or less the same, and the same seems to be true of ‘MLOTEs’ (‘main language other than English,’ in case you have lost track) and ‘non-MLOTEs,’ and of ‘EAL/Ds’ (‘English as an additional language or dialect’) and ‘non-EAL/Ds.’ However, it is not language that is meaningless as a factor in student performance; it is dodgy data that create this illusion. The proliferation of data categories that more or less refer to the same status will inevitably leave people confused and unwilling or unable to take ‘language’ seriously.

Language-related issues then become displaced onto race or socio-economic disadvantage. Educators and the general public fail to see a child who needs English language support in order to achieve academically; instead, they see an Aboriginal child achieving poorly. We then collectively throw up our hands in despair and decide that Indigenous education is a problem too big and intractable to fix. And because there is nothing we can do, we might just as well ignore the problem for another year.

However, this adds insult to injury: misdiagnosing the language side of Indigenous children’s academic failure is itself a significant part of the problem. A first step toward addressing it would be to get our data in order, which entails compulsory language-related training or qualification requirements for Australia’s teachers, test designers and policy makers.

Reference

Dixon, Sally, & Angelo, Denise. (2014). Dodgy Data, Language Invisibility and the Implications for Social Inclusion: A Critical Analysis of Indigenous Student Language Data in Queensland Schools. Australian Review of Applied Linguistics, 37(3), 213-233. [open access available]

Author: Ingrid Piller

Dr Ingrid Piller, FAHA, is Distinguished Professor of Applied Linguistics at Macquarie University, Sydney, Australia. Her research expertise is in bilingual education, intercultural communication, language learning, and multilingualism in the context of migration and globalization.


Join the discussion (4 Comments)

  • Brendan Kavanagh says:

    Around one third of the Northern Territory’s population is Aboriginal, with many people living in remote communities. English is not the language of these communities. The clients at the shops, the clinic, the shire office all speak a different language, yet our lens for achievement is based purely on NAPLAN testing via the Closing the Gap report. There is no onus on visiting teachers or store managers to learn the local language, despite this being the language of the local economy.
    There is also much hype around the success of the Literacy for Life Foundation, which features prominently in the latest Closing the Gap report. While I respect their community-controlled model, it should be noted that the communities they have operated in are English-speaking communities.

  • Sophia says:

    You are right that educators and the general public often fail to acknowledge a child’s need for language support in order to achieve academically.

  • I know the US context for the education of culturally and linguistically diverse learners. The issues raised in this blog resonate with the US context. Often the solutions provided to these learners focus on remediation and a decontextualized, skill-based curriculum, about which I wrote in my blog. Thank you for providing excellent research on matters of language and education.
