@OfstedNews Say Sorry by @TeacherToolkit

Reading Time: 5 minutes

This is another blog about the invalidity and unreliability of Ofsted.

I will start with our own Ofsted experience in September 2014; an inspection which has definitely led me to question the validity of Ofsted more than ever before. This blog has been six months in draft and, following Sean Harford’s announcement at the ASCL conference (March 2015) about peer-to-peer inspections, it is now time to press the ‘publish’ button. Read @LeadingLearner‘s summary, Inspection’s Nearly Over! It’s Official.


In a nutshell, and in my opinion, our judgement was determined by our Year 11 results, despite our sixth form, leadership and management, and behaviour and safety all being judged good. Our 5 A*-C (including English and maths) figure is 3% below the national average (state schools only, and above the independent schools’ figure). The criteria for ‘good’ achievement state: ‘from each different starting point, the proportions of pupils making expected progress and the proportions exceeding expected progress in English and in mathematics are close to or above national figures.’

Had achievement, in the light of national results, been judged ‘Good’, then so would teaching and learning, and the school overall. Being one of the first schools to be inspected with no lesson observation grades was an interesting experience; I think it is a positive development, but it did mean that the chances of the achievement and the teaching and learning judgements being different were even more remote. Something worth noting: in December 2011, when our school was last inspected, we had very similar results (in line with very similar national averages) and yet we were judged Outstanding overall!

Our overall judgement is Requires Improvement, although I would argue that we are not an RI school. Far from it; but this depends on how a school is measured. There was no issue at the time of the inspection, but we then wrote a letter of complaint (in the absence of an appeal procedure) only when national results had fallen by 4 points and ours had stayed the same. This was to challenge the decision. Ofsted’s response was as follows:

“I am sorry that the emergence of national data gave you cause for complaint and I hope that this response has served to explain matters.”

Image: The Sunday Times

From a personal and recent perspective, I urge Ofsted to review how schools are judged during the first half term of the autumn, before RAISEonline is published. It is very significant that Ofsted liked what they saw in our classrooms; nevertheless, our headline results determined our overall teaching and learning grade. If our results had been higher, our grade for teaching and learning would have been higher too. It follows that if outcomes require improvement, then teaching and learning over time cannot be judged any better, which makes one wonder what significance the removal of graded lesson observations actually has on the outcome of an inspection.

Complaint:

[School] was inspected in September 2014. We appealed the outcome using the complaints procedure when the national data was published, showing a significant fall in GCSE examination results. Without the fall we would not have appealed. The appeal was unsuccessful; the response stated that ‘due regard was paid to the information that was available to them (the inspectors) at that time’, and as the 2014 national results data was not available, it was not taken into account. Although we were disappointed, this is not the main point of sharing this blog.


The outcome letter stated that we had complained about the composition of the inspection team. We made absolutely no reference to this in our appeal letter, which stated:

“We would like to emphasise that we had absolutely no concerns about the team, they were an excellent group to work with, and we are only raising these issues following the recent publication of national examination results.”

The high level of experience of the team did come up in our conversations with the panel who handled our complaint, but we did not raise it as a concern in any sense. For the record, we were not asked whether we were concerned about it, and there was no check at the end of the telephone call as to whether we wished to add this to what we had stated in our letter of complaint. We requested that this be removed from the outcome letter, but this was refused by [the Principal Officer for the Inspection Quality and Complaints Administration], as detailed below.

“I am sorry that you are disappointed by this decision but it would not be appropriate now to remove this from the complaint response.”

(12th December 2014)

I am very surprised by this, not least because if we had raised it as a concern it would have been entirely inconsistent with our feedback to the lead inspector at the time and in the form we completed later. We would not like the HMI who led the inspection to think we had complained about something we could easily have mentioned at the time of the inspection, when we were given ample opportunity to do so. The lead HMI was included in our response to the investigating officer.

If Ofsted keeps files for each school, we would like our letter to be included alongside the various communications about the appeal, so that our side of the story is at least captured for any future reference. We would be grateful if Ofsted could confirm this will be the case.


We are still awaiting a reply.


Inspection Handbook:

Paragraph 57 of the new inspection handbook (January 2015 revision) makes matters worse, stating:

“Inspectors should note that the introduction of an early entry policy and changes in GCSE examination structure have had an impact on the 2014 Key Stage 4 results. The changes should be taken into account when considering results alongside those of previous years, as neither direct comparisons nor production of three-year trends are possible.”

Of course, our school’s 2014 results were compared to the national 2013 figures, because the 2014 data was not available at the time. This is seriously exasperating.

How many other schools have suffered the same outcomes as us?

Response:

The School inspection handbook (Ofsted ref. 120101), paragraph 4, states that “inspectors must use all the available evidence to develop an initial picture of the school’s academic performance. This includes data available to the inspectors at the time of the inspection.”

The evidence recorded before and during the inspection shows that due regard was paid to the information that was available to them at that time. In other words, if we had been inspected later, when the 2014 national results were available, the outcome could (not would) have been different. We had a strong case, albeit not a bulletproof one.

Having taken advice from a senior inspector and from professional associations, I found that neither was surprised; according to the inspections specialist, we are one of many schools in this position. The word ‘lottery’ was used in the reply.

I thought it was time to share this blog on behalf of other schools with similar experiences. I think it’s safe to say that, after six months of silence, the Grim Reaper is back!

We must have a reliable Ofsted. Can we sort this out sooner, rather than later please …


@TeacherToolkit

In 2010, Ross Morrison McGill founded @TeacherToolkit from a simple Twitter account through which he rapidly became the 'most followed teacher on social media in the UK'. In 2015, he was nominated as one of the '500 Most Influential People in Britain' by The Sunday Times as a result of being most influential in the field of education. He remains the only classroom teacher to feature to this day ... Sharing resources and ideas online as @TeacherToolkit, he has built this website (c2008) which has been described as one of the 'most influential blogs on education in the UK', winning the UK Blog Awards (2018). Read more...

16 thoughts on “@OfstedNews Say Sorry by @TeacherToolkit”


  • 25th March 2015 at 7:19 am

My sympathies, this is so unfair. You have been judged by people who use data they don’t understand to make judgements that are indefensible in any logical way, ignoring all the good things going on that can’t be reduced to a set of numbers. From a personal viewpoint it makes very interesting reading, particularly the grading being based on Year 11 results. The school where my children go/have gone to went from ‘outstanding’ to special measures a bit over a year ago. As far as I can gather this was based on the underattainment of a few low-ability Year 11s, and not gaming the system. They stayed with GCSEs while some other schools had these types of students doing equivalents. I remember my son coming home and saying “guess what Mum, we’re inadequate”. The pride students had in their school was severely dented. However, I have not met a single person associated with the school who believes Ofsted were right – not outstanding maybe, but certainly not inadequate either. It has created a massive amount of stress and work for staff and has been unsettling for students, with Ofsted back in every few months. The school, surprise, surprise, is now converting into an Academy and will no doubt have the SM label removed in the not too distant future, with Ofsted taking credit for the school’s ‘improved performance’. This system must change.

  • 25th March 2015 at 8:21 am

I absolutely agree, having had the same experience. For Ofsted to judge against 2013 data – both for comparison to national figures and also as a final outcome – is so wrong, and reduces the process to a 10-minute conversation at the start. The team always find the evidence to match the results data! We are RI but 10% above national for 5 A*-C including En and Ma!

      • 27th March 2015 at 12:05 am

Comparison to national figures is irrelevant. It is about progress from starting points. Look at your school’s Best 8 VA score for the fairest progress comparisons.

      • 27th March 2015 at 6:14 am

        Yes. Fully aware of this. Our 6th form gained the best results ever. Our Eng/Maths residuals are good. Will double check and re-reply

  • 26th March 2015 at 3:55 am

    The whole situation is ridiculous and unfair. We too were inspected in September and it was the 1st inspection by that team in the academic year. The inspectors who came to ks1 were highly critical; they said our Foundation Stage baseline assessment was too soon, that our HA were not being extended and our differentiation wasn’t good enough. They used last year’s data, with results for children who had completely different teachers to the ones they inspected. On comparison with this year’s data, the percentage of children getting a level 3 in maths and writing was significantly above national average and in reading was in line. We also received a payment from David Laws MP for the achievements our pupil premium children made.
    The lead inspector actually told our head that pupil progress counted for little and that attainment was what it was all about. We also noticed that part of the team had no understanding of the new curriculum.

  • 26th March 2015 at 8:39 pm

    If teachers were to assess pupils’ progress using the same methodologies as those demonstrated in your recent Ofsted they would rightly be criticised by leadership and parents, and yet these anachronistic judgements seem to be all too common.


  • 4th April 2015 at 9:36 pm

    Sorry to hear that – it does seem very unfair and you are right about inconsistencies. HAYLES – we were inspected in October by a team who understood the new curriculum and we were given a good as a result of all we were doing. It seems ridiculous that we should have such different experiences only weeks apart.

    • 9th April 2015 at 8:02 am

I’m glad to hear your experience was better tgillat. Our overall judgement was good but it was the worst Ofsted we’ve ever experienced. One inspector questioned a colleague’s use of certain geographical terms despite their use in the new curriculum. They also got a bee in their bonnet about the fact our new foundation stage children hadn’t been shown how to sweep up sand from the sand tray; these were children who had been in school for 3 days!
From my own experience, the inspector I dealt with seemed to forget it was the very beginning of the academic year. She wanted to hear some of my children read and interrupted my teaching to ask me what their national curriculum levels were (bearing in mind I’d just inherited this class a week previously). She was livid when I pointed that out, and when I said that the HA child she’d asked to take was around a 2B, her response was ‘well, that’s average then.’ Yes it is, at the end of the academic year, not one week into it!

      • 9th April 2015 at 12:33 pm

That sounds like an absolute nightmare in comparison; the lack of consistency is a bone of contention and is why Ofsted judgements are so contentious. If you had a reasonable set of inspectors (who, in my opinion, should include existing teachers, as they know what is going on) then it would make the process fairer and clearer. Honestly – I can’t get over the comments, and as for interrupting your teaching and coming out with the 2B comment – nonsense, isn’t it?

