It is laughable that John Banks told Cabinet in July 2012 that “A strong evaluation programme will be put in place that thoroughly examines the impact and effectiveness of the first such schools. This will enable us to make informed decisions about whether or not to open further such schools in the future.”
The three-part evaluation of charter schools has failed in key respects to deliver on what Banks promised.
First, there is absolutely no attempt in the final report to evaluate the most important outcome, which is student achievement.
Instead, we get a wishy-washy statement: “MartinJenkins has worked with the Ministry to refocus the final year of the evaluation (away from a primary focus on outcomes) because it was still too early to determine “success”: schools/kura were still becoming established, numbers of students that had received a “full dose” of the PSKH intervention were low, and efforts were ongoing by the Ministry to define and agree contracted outcomes.”
Charter schools were touted as having “freedom from constraints imposed on regular state schools in exchange for rigorous accountability for performance against agreed objectives.”
So, if these schools had agreed objectives in their contracts from the outset, where is the rigorous analysis of how they have performed? And why would the Ministry of Education still be “defining and agreeing contracted outcomes” when the schools are in their third or fourth year of operation?
The real answer is simple: they have not performed as expected.
The primary content of this expensive exercise was not a strong evaluation of impact and effectiveness but a weak gathering of “survey” data from students and whanau.
But even this part was an embarrassment for charter school supporters.
“Low response rates to surveys and selection bias meant we were not able to examine student and whanau perspectives from all angles or across all schools/kura.”
Five of the eight schools were included in the student surveys, but response rates at three of them were so low that they were excluded. So, instead, the responses from the two Villa Education Trust middle schools were merely presented as a “case study”. Wow!
But the problems did not stop there.
“Limitations in the administrative data meant:
- Attendance data was not sufficiently robust to be included;
- We were unable to compare quality of outcomes with outcomes that had been achieved at previous schools or to accurately identify where students went after exiting.”
The comment about the attendance data was interesting, given David Seymour’s press release only last month that charter schools outperform state schools on attendance.
There are numerous other problems and gaps with the MartinJenkins report and they can be discussed more fully in due course.
Overall, it is clear that this exercise has simply not produced the thorough examination that John Banks promised, let alone enabled informed decisions about whether to open further such schools.
It was clear from the outset that the charter school policy was driven solely by political ideology, and the Minister of Education, Chris Hipkins, was right to dismiss the evaluation report as being of no real value to policy makers.
Maybe the very poor survey response rates – even from those closely involved in these schools – send a clear message: it is time to move on.
~ Bill Courtney, Save Our Schools NZ