Weighing in on the public battleground, four academics from universities around Australia – James Ladwig, Kevin Lowe, Colleen Vale and Ian Hardy – have shared their concerns and reflections on the value of the high-stakes test at a media briefing run by the Media Centre for Education Research Australia (MCERA). 

EducationHQ sat in on the proceedings, which homed in on the idea that while NAPLAN data is limited in its scope to measure students’ actual learning progress, there would be consequences if it were abolished completely.

Here’s a summary of what each of the four panellists had to say: 

Associate Professor James Ladwig, University of Newcastle

NAPLAN data needs to be removed from the political realm and put back in the hands of those who can use it effectively, Ladwig says.

And politicians need to “think more carefully” about how they use and share schools’ NAPLAN data on the MySchool website.

Although the argument put to the public is that displaying the results establishes a “transparency” across schools, the expert says it’s “one of the worst uses” of data he’s seen.

“It’s contributed more to a misunderstanding of what’s going on than an understanding,” he tells participants.

Ladwig notes that NAPLAN was not designed to be a high-stakes test, and the data it generates has “big margins of error”.

His message to policymakers is clear: NAPLAN does not provide a reliable measure of individual student achievement, and should not be used to drive choices for a student.

He has a word of caution for principals too: do not accept NAPLAN test results as a KPI. It takes about five years of data to build up any sort of meaningful indicator of progress.

And to parents concerned about the anxiety the test induces in their children, Ladwig’s words are unambiguous.

“It isn’t mandatory, if it’s causing your child stress, take them out!”

Dr Kevin Lowe, Macquarie University 

Next to take the virtual stage, Lowe is keen to show that NAPLAN results are useful, because they paint a very real picture of the systemic marginalisation of Indigenous students and the inequality that’s rife in our school system. 

The data indicates that by the time Indigenous students are heading towards the middle stage of their secondary education, their literacy and numeracy capabilities are up to three years behind their non-Indigenous peers. 

Some are lagging up to seven years behind, Lowe says. 

And while this is a “well-known systemic issue”, thus far we’ve attacked it with a range of ineffective strategies, Lowe states.

The achievement gap can’t be fixed with “silver bullet” literacy and numeracy programs; what’s needed are direct interventions at the whole-school level to create an education that resonates with Aboriginal students and their communities. 

Lowe is concerned that if NAPLAN is dissolved entirely, then cohorts of Indigenous students will continue to slip through the cracks and finish their schooling without basic literacy and numeracy skills. 

Professor Colleen Vale, Monash University

Speaking on NAPLAN’s ability to gauge students’ mathematical reasoning skills, Vale says the high-stakes test should be considered just one small slice of the assessment pie.  

Firstly, Vale believes NAPLAN testing is conducted at the wrong time of year.

Research shows, she says, that most learning occurs during Terms 2 and 3, and tends to stagnate in Terms 1 and 4.

Thus, testing students at the beginning of Term 2 gives us only a clipped snapshot of the depth of learning a child achieves over the course of a year.

When it comes to measuring mathematical understanding, Vale contends that NAPLAN’s effectiveness is “debatable”. 

Teachers and parents need to focus on a child’s individual growth in their learning, not on where they sit against the standards.

We need to draw on a range of assessments to truly map mathematical proficiency, including students’ own self-assessment of their progress. 

Teachers need to be afforded more time to analyse and reflect on all the data that’s being collected, and to plan how they will address the issues it highlights. 

For parents, NAPLAN data isn’t very helpful, Vale says.

This is because, year to year, a student is likely to stay in the same place on the graph.

It’s up to schools to really emphasise other data and learning that has accumulated over the year, and to talk to parents about all the different examples of their child’s progress.

In short, we need to shift from a focus on ‘deficit’ – what children don’t know – to one which tests and shares what students do know. 

However, despite NAPLAN’s limitations, it’s crucial to have in place a “big picture” analysis of where all kids are at in their learning, Vale says. 

Dr Ian Hardy, University of Queensland

Teachers are drowning in a “deluge of data”. That, at least, is the impression Hardy has gathered.

His research has sought out the voices of teachers at the classroom coalface to unveil their feelings about NAPLAN and the type of learning culture it creates. 

Hardy says teachers felt “up to [their] eyeballs in data”, and locked in a cycle that demanded “continuous improvement” with a focus on “moving data” all the time.

They also reported feeling forced to form “aspirational goals” for their students, which may not be realistic. 

Data has become “a four letter swear word” in schools, Hardy concludes. 


So what's the NAPLAN sentiment in political circles?

Get Simon Birmingham and Rob Stokes' take on the controversial testing regime here