Apples and Pears

March 2010

League tables are a bogey for many in the school sector, but inter-school comparisons have their place, reports JOHN GERRITSEN

Teachers are normally wary of comparisons between schools. Such exercises are, they say, notoriously unreliable. A school with a roll of students from well-heeled backgrounds will do better than a school with students from poor backgrounds, but what the league tables generally fail to convey is how much ‘value’ schools have added to their students. Is the apparently high-performing school really doing well or is it simply cruising on its students’ abilities and background? And what about that ‘poor’ school – is its performance below average or an amazing feat of teaching that has lifted its students’ achievement well above what might normally be expected given the skills and knowledge with which they entered school?

Such issues are currently under debate amid concerns that the national standards for primary schools will result in league tables. But in the secondary sector, league tables based on NCEA results have long been available. The news media seem to have got part of the message and generally compare schools of similar deciles; the annual Metro magazine rankings, for example, apply a range of weightings so that schools of different deciles can be compared.

And it is not just the media that is involved. The New Zealand Qualifications Authority’s website allows visitors to compare the results of individual schools with those of other schools or groups of schools. Schools themselves also make such comparisons in order to check their performance against national averages and averages for similar schools.

So when are school-to-school comparisons valid and when are they not? The answer from the secondary school sector centres on context. A ranking or level of performance is meaningless without knowing the factors behind that performance.

Secondary Principals Association president Peter Gall has no time for league tables that focus on NCEA pass rates by year level, for example, the proportion of Year 11 students who pass NCEA level 1. It’s an old-fashioned approach based on the premise that students should be passing a particular NCEA level in a particular year and quite divorced from the reality of the qualification.

For example, Gall’s decile 3 school has a relatively low level 1 NCEA pass rate among its Year 11s but there is a good reason for that – the school exposes students in that year to a lot of unit standards so they have better pathways available to them in future. Then in Year 12, students pick up both their level 1 and level 2 certificates.

More important than year level pass rates, he says, is the endpoint – the students who leave a school in a particular year. “We should be focused on that cohort of students that leave school in any one year. What do they leave with? What do they leave without?”

Gall is also dismissive of weighted league tables. He says there are so many pathways students can take to reach the endpoint that is right for them that no league table, even a weighted one, can capture them all.

“You’ve got to be very careful with NCEA data because there is so much data there.”

Asked if parents use league tables to help choose schools for their children, Gall says he expects that they do, but only to confirm their thoughts. “They look at them to satisfy their decision already made.”

Michael Loretz is deputy principal at Mount Roskill Grammar in Auckland, a decile 4 school that does well in the Metro league table. He says the danger of league tables is that there is no ‘story’ about what each school is doing with its students.

And if parents start using league tables to choose schools, schools that are not doing well will start to fudge their results in order to improve their standing on those tables. “Then the whole game becomes a different game.”

“As a profession we need to focus on our dialogue with our parents and make sure it’s really rich.”

Asked if league tables have been damaging for secondary schools, Loretz says some schools have struggled to get positive stories out.

“Teaching and learning isn’t about a number at the end of the day... there is always more to say than just a number.”

But Loretz says his school, like many others, compares its performance in the NCEA each year to national averages for its own purposes. Such comparisons are not helpful for planning the teaching and learning of individual students, but they do provide a benchmark to aspire to.

The work involved in creating such comparisons is considerable, and an Auckland maths teacher, Wayne Atkins, is hoping to find a ready market for a new product providing in-depth inter-school comparisons of NCEA results.

Launched this month, the ‘On Your Marks’ DVD, containing information gleaned from 2008 NCEA results, has been sent to almost every secondary school in the country. The DVD allows schools to compare their performance with that of others by type of school, subject, and even individual standards, and Atkins hopes the free sample will generate orders for a disc with 2009 results.

But Atkins’ aim is not to provide the basis for league tables. Rather, he is interested in selling principals and boards of trustees a way of spotting where things might be going right or wrong in their school.

Atkins has first-hand experience of the ‘wrong’. He was a teacher at Cambridge High School in 2000 where he found appalling misuse of internal assessment under the old School Certificate system. For example, students likely to do poorly in external economics exams were credited with higher internally assessed results than they had actually received in order to ensure that they would pass the subject, thus giving the school an artificially high pass rate. The practice continued until a complaint in 2004.

He says his disc will help schools identify where their internal pass rates might be abnormally high or low compared to similar schools.

“A board owes it to their students to investigate the high results and the low results.”

But he warns that the information on the DVD is only a starting point. It does not identify that a school is doing something wrong, only that an area is worth looking into further. Such warning bells should ring if a school’s internal pass rates in a particular standard or subject are much higher or lower than their external pass rates, and if their pass rates are significantly different from those of similar schools, he says. He notes that high internal results could well be an indicator of great teaching, or of a strong emphasis on content that is well suited to the students.

Comparison with similar schools is particularly important, he says, noting that some anomalies are easily explained by comparing schools with the right peers. For example, a school using the Cambridge International Exams (CIE) might appear to have a low NCEA pass rate until compared with other CIE schools. Similarly, a boys’ school might appear to have an overly high proportion of internal assessments until compared with other boys’ schools. Another important factor is the number of students sitting a particular standard. A high pass rate in one standard could be due in part to a relatively small number of students sitting that standard, while another school might have a lower pass rate because it puts large numbers of students through the same standard.

Atkins also says figures can be skewed by schools “banking” their results – reporting the results of a group of students the year after they actually achieved those results.

He says schools’ resistance to inter-school comparisons or league tables stems from the fact that such exercises often do not compare similar schools. League tables also often weight their results – giving more points for Excellence results than for Merits, for example – and a lot depends on how much weighting is placed where.

Atkins says his experience creating a system for comparing school NCEA results has shown him that rankings of schools should be approached with extreme caution.

“In most cases a league table that ranks one school above another school is probably very dubious,” he says.

“I know parents are very keen to have one measure, but the idea you can have one measure to rank schools is dubious if not dangerous.”

He notes that a key point is the approach schools take to the NCEA. “Every school uses NCEA for a different purpose. It is very difficult to make valid comparisons between one school and another.”

He says his product does not so much compare schools, as subjects within schools. And it is not a product for parents – if they want to know more about a school, they should talk to teachers and managers at that school.

It is likely that there will always be popular and unpopular schools, with league tables playing a role in that dynamic. The experience of the secondary school sector suggests that comparisons between schools have a place, it’s just a matter of keeping them in context.
