How long do we have to wait before OfSTED, in its current form, becomes untenable?
[This is an anonymous blog published by #TeachingSecrets on behalf of a deputy headteacher.]
As I start writing this piece, I find it ironic to receive a circular email inviting me to a “DfE presentation piece on Multi Academy Trusts (MATs)”. The opening line is this:
‘Over half of England’s academies are within a multi-academy trusts (MATs), and this number is set to rise with the government having a strong preference for this model.’
‘A strong preference’?
This has been the line taken by the Department for Education ever since the Government failed to bring before Parliament a bill that would have forced standalone and Local Authority schools to become part of a MAT. There is obviously a concerted effort to use other forms of influence now that the coercion route has been rejected.
Little did I realise how compliant OfSTED would be in helping the DfE to achieve that aim.
The Beginning of the End
I was never fearful of OfSTED and in the past I have found visits helpful. I have faced several inspections over my time in education and still remember my first as a newly qualified teacher. I was in my first term of teaching, and the HMI (Her Majesty’s Inspector) who observed me gave me great constructive feedback and helped me develop as a teacher. As time has gone on and I have moved up through the ranks, I have found OfSTED to be far less helpful (with the odd outlier) and far more critical as the goalposts have changed year on year.
Despite this, I always thought that they were at least semi-independent of the Government and they were just constrained by Governmental accountability measures and the changes that occurred to them. Now I have become more cynical, or less naïve (take your pick) because, despite the former Chief Inspector’s well-publicised spats with power, it appears that OfSTED is becoming the very vehicle by which the move towards MATs will be achieved.
The reason I think this is down to my most recent OfSTED visitation, in which it was quite clear there was an agenda, and the Mark Twain quote of ‘lies, damned lies and statistics’ could not have been more true. Prior to the visit, after many years as a deputy headteacher, I was applying for headships in deprived areas, where I have spent most of my career. Like many, I felt a vocational sense of giving back and trying to improve social mobility despite the obstacles that presented themselves. By the end of this visit, and the subsequent intransigence throughout a complaints process that is so clearly not fit for purpose, I had stopped applying for those posts. I started applying for posts in the independent sector and abroad.
My rationale for this was that however hard we work, however many sacrifices we make in terms of personal health and family life, the deck is always going to be stacked against us.
I was reminded of a colleague, a long-serving principal from the North West of England whom I had previously met on the interview circuit and for whom I have great respect; his perception was that you could predict an OfSTED judgement from a school’s percentage of free school meals. He is right: there are exceptions, but most schools and academies carrying a 3 or 4 judgement serve deprived areas.
The story of the visit was one of a preconceived idea of what the school was like, followed by a search for evidence to justify that idea. This mainly stemmed from the public data, which was wrong.
How can the public data presented in the mighty dashboard and RAISE documents be wrong, you may ask? The OfSTED team didn’t get it either, especially the lead inspector, who came from a primary background, had never held the top job, and was leading a team inspecting a large secondary. The lead knew their framework back to front and used every applicable section to justify decisions, but they had no real understanding of the issues that a school like ours had to deal with, especially within the context of the changing accountability measures.
The starting point was the Data Dashboard, which was used almost exclusively throughout the inspection, despite Sean Harford’s public guidance that “the dashboard should be used for one-day inspections and RAISEonline should be used for two-day inspections”. RAISE was never used and was kept out of discussions, despite our attempts to show the copious amounts of ‘green data’ on it.
The issue with both sets of data is that they included several students who were not supposed to be on them. We run an alternative provision on-site which serves the surrounding community, and for some reason those students’ results were designated as ours. The impact of adding students who could not handle mainstream education into our results had a significant effect on virtually all groups in our results set. We had to approach the DfE several times before it eventually realised the error and changed these results. Our performance table data has subsequently been corrected, and the impact was to move us from below national average to average. We had tried to show this to OfSTED and highlighted it during the so-called complaints procedure, but no one would listen.
So to the ‘statistics’ then.
Our 5A*-C E/M (English/Maths) went up 6%. Our E/M combined was 64%, both English and Maths APS (Average Point Score) went up by 2 points, and our maths – which was criticised in 2015 – improved for every student group: disadvantaged and all prior-attainment groups. Page 37 of our RAISE report showed our disadvantaged students were significantly above national in EBacc subjects, English, Science and Languages, with only Humanities significantly below. (This, however, changed after the DfE rectified our cohort and the Humanities Progress 8 score dramatically improved.)
Our overall disadvantaged P8 score was above the national disadvantaged score and our non-disadvantaged students were above national but, as you are probably aware, disadvantaged students get compared with non-disadvantaged students nationally. After the DfE rectified its error, that gap was 0.27 (not exactly huge).
“So, what does this all equal?” you ask.
Special measures of course.
Yes. It seemed strange that, after the DfE’s correction, only one P8 score remained significantly below national on the data dashboard (you know – the one not supposed to be used in two-day inspections): High Prior Attaining Disadvantaged students. They were at -0.9, which is not good by anyone’s standards, I agree. “Outliers?” I hear you ask. Why yes, we had outliers, and as Sean Harford has said, these should be taken into account when they significantly skew results. All our outliers were in this group (not surprisingly, given the nature of the group).
We had just over 200 students in our Year 11 cohort. Of this intake, 32 were in the ‘HP attaining disadvantaged’ group, and of these, 7 were outliers. When we talk outliers, we’re talking serious issues: sexual exploitation, incarcerated parents, moving house (three times in the year), drugs, gangs, mental health problems, and so on. It was a miracle that some were still alive, and yet we were supposed not just to get them into an exam room but to have them achieve top grades.
Makes No Difference
“Surely this must have been taken into account when presented to the lead inspector?” I hear you ask.
Not a chance. To paraphrase: “… you have so many disadvantaged students that these few won’t make much difference overall.” I beg to differ. The P8 for that group went from -0.9 to -0.14 (no longer significantly below) once those outliers were excluded. Without the two female students in this group, all the female-related stats became positive. Yet none of this was taken into account.
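The weight those seven students carried can be sketched with some back-of-the-envelope arithmetic. A simplifying assumption on my part: I treat the published Progress 8 figures as simple means over the students in the group, whereas the DfE’s official calculation is more involved, so take this as illustrative only.

```python
# Illustrative sketch, NOT the official DfE methodology:
# assume the published Progress 8 figures are simple means over students.
group_size = 32      # HP attaining disadvantaged students in the cohort
p8_with = -0.9       # group P8 with the outliers included
outliers = 7         # students facing serious circumstances outside school
p8_without = -0.14   # group P8 once those seven are excluded

# Back out what the seven outliers must have averaged between them.
sum_with = group_size * p8_with                     # 32 * -0.9  = -28.8
sum_without = (group_size - outliers) * p8_without  # 25 * -0.14 = -3.5
outlier_mean = (sum_with - sum_without) / outliers

print(round(outlier_mean, 2))  # about -3.61
```

On those assumptions, the seven averaged roughly -3.6 each: well over three grades per subject below expectation, which is exactly the kind of skew the public guidance says should be taken into account.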
“But OfSTED are supposed to take into account all existing data, aren’t they? And you can show this in any format you want?”
That’s what OfSTED say!
All our year groups showed the majority of subjects ‘narrowing the gaps’ for SEND students and for disadvantaged students. However, if the inspection team don’t like the way you have presented the data, then it is worthless regardless of what it shows; so it didn’t count.
“Okay then, what about your sixth form?”
Our sixth form data showed increases everywhere, with disadvantaged students outperforming the others. The one silver lining was that our sixth form provision was deemed to be good. Yet despite the sixth form being good, making up a quarter of the school in terms of numbers, and being taught by the same teachers, we were still in special measures.
Basically, the data dashboard was used to justify everything, especially regarding the HP attaining disadvantaged cohort. Whatever argument we put forward to show progress, we were told that there was no improvement from the previous year (and that we ‘lacked the capacity to improve’). The lead inspector used the 2015 data to say that we had issues, and when we showed ‘like-for-like comparable data’, we were told that it was ‘a shame that it is old money’.
“So, OfSTED can use old money but we can’t?”
It makes sense. Our whole performance came down to 7 students who the lead inspector refused to accept as outliers, despite the public guidance from Harford.
As many of you who have been in the situation of having ‘poor’ data will know, the inspection then takes on a life of its own, with a self-serving approach to finding other evidence to justify the judgements being made.
The fact that our deputy for teaching and learning had to demand a meeting with an inspector to highlight good practice and show our quality assurance processes is incredibly telling. Why would OfSTED want to see good practice if they were trying to paint an image of special measures?
The complaints procedure was a farce.
The system provides for special measures judgements to be quality assured ‘in-house’ without any outside evidence from the school muddying the waters, and then when you are finally able to protest against ‘the process’ (after the report is published), the limits put on you are laughable – Kafka would be proud.
Okay, so what has all this got to do with MATs?
Any school or stand-alone academy placed in special measures is automatically placed into a MAT. We reluctantly accepted this and had talks with a potential MAT in the area, which seemed receptive to us joining it. When we put forward our preferred bidder, the DfE said, “it is not the school’s preference that counts; it is the DfE who chooses the preferred partner”.
This all happened extremely quickly; one could even say with indecent haste. It makes one think that someone had decided which MAT we would join a long time ago. Conversations behind closed doors, where influential people talk to other people in positions of power.
“But, how could they achieve that aim?”
Oh, I know. Place the single-status academy, sitting on a £300M plot of land, into ‘Special Measures’.
“Surely not. Who would ever think such a thing?”
The irony of becoming part of a MAT is that the school won’t be in Special Measures any more, because obviously all those so-called inadequacies suddenly disappear with one wave of the magic wand when the school is renamed with a new URN (Unique Reference Number). And within three years, when the ‘new’ academy is finally inspected, the likelihood is that it will be judged ‘Good’ or ‘Outstanding’, to demonstrate that MATs are responsible for school improvement.
I suspect the ‘most vulnerable’ from the ranks of disadvantaged and SEND students will be culled in the first few months, then the results will improve as they would within any school. Sadly, I’m not sure what may become of these students or the schools in the surrounding area that have to accept them, but as long as the MAT looks great and “OfSTED look like they’ve made the right call”, then what does it matter …
And you wonder why I’m moving abroad.