Is there life after levels?
I found the academic year 2014/15 the most stressful of all my time in teaching. As assessment lead in a small primary, a major part of my role over the last couple of years has been to lead the school team on what was heralded as an ‘exciting journey’ beyond levels, but which was in effect a ride that initially left us feeling vulnerable and very exposed.
We were leaving behind an assessment system we had refined over the years, one that worked perfectly well for us (or so I thought), without knowing to what or to where we were moving.
I began to research – but the more I read, the more bewildering I found the situation. The reasons for abandoning levels were clear. I understood them to be:
- An undue emphasis on pace. The rate of progress – how fast pupils moved through the levels – had become more important than pupils’ understanding of the curriculum. It became expected that pupils would exceed the national expectation!
- The unsuitability of using “best fit” descriptors. A best fit is not always a secure fit and, coupled with the problem of undue pace through the levels, many pupils were moved on because they best fitted a descriptor but may have had major gaps in their understanding.
- The problem with determining levels by average marks on a test. A high number of marks could be gained from, for example, level 2 questions and some from level 4 questions, and yet, when averaged out, a pupil may have been awarded level 3. The pupil wasn’t really a level 3, but the marks may have declared them so.
- Jurisdictions that have high international rankings have never used a system of levels. Instead, assessment is based on “depth of understanding” or “mastery” of all of the key concepts of the curriculum.
(Taken from a presentation by Tim Oates, chair of the Expert Panel responsible for revising the national curriculum.)
The research made perfect sense and excited me – but many of the schools and individuals that I talked to had already jumped ship, replacing their levels with commercially produced packages full of tick-lists and half-termly data inputting.
Had I misunderstood the purpose?
Surely that [software] was replacing levels with something that looked like levels and worked like levels, but was just not called ‘levels’!
I had even heard of one large school that had bought into three of these different software packages, expecting their (overworked) teachers to trial all of them during one academic year. As a small school we simply couldn’t afford to do that!
And so we held our ground for the year, a large part of me hoping that wisdom and clear direction would emerge from a ‘higher’ source. This was a time during which Ofsted could ‘turn up at any minute’, as we were well overdue an inspection, but I didn’t know whether I was William Wallace or a meerkat in the Kalahari! Either way, I felt on constant alert, and it was exhausting and stressful. We were in limbo.
However, by the summer term I had read enough to be confident in what assessment without levels should look and feel like. It no longer worried me that our system would be unique to us – that became the point! I had stumbled across @beyondlevels, gained expertise from the ‘ramblings’ of Michael Tidd and, more importantly, had the overwhelming support of both my head teacher and a resilient group of staff, willing to embrace change.
“Less but better …”
Our system’s key features
So, here are our plans and principles:
- Key objectives will be the main drivers of our assessment for learning cycle. As a staff, we worked together to decide what our key objectives should be. We wanted a manageable number – and not one achieved by bundling lots of small objectives into a larger one! What really mattered? What were the needs of our pupils? What did we need to champion? What did we want our pupils to be able to do and understand by the time they left us? ‘Less but better’ was our mantra.
- We wanted our pupils to master the objectives, defining ‘expected’ as the application of objectives in different contexts, including reasoning and justification. Digging down to greater depth would require cross-topic and cross-curricular application. This would be evident in pupils’ books, lessons and conversations.
- We would continue with three pupil progress meetings throughout the year. Every child’s progress would be discussed and celebrated and next steps identified and supported.
- Monitoring would continue with rigour – with a focus upon mastery and the involvement of the whole staff. This year we have followed key objectives and strands from EYFS to Year 6 in staff meetings, sharing our work and books so that we feel more secure about what our ‘expected’ looks like.
- We also still wanted the support of an objective test, as this is what the children and the school would ultimately be judged by. We had used the CATs successfully for many years and didn’t want to lose the parts of our previous system that had always worked well for us, providing valuable information for teachers, parents, pupils and governors. We also wanted a package that could be used across the entire school. We therefore decided to buy into the complete GL Assessment package, which offered annual progress tests in English, maths and science amongst other assessments. The tests were digital, which also addressed a big workload issue: they are analysed in depth and reports are created instantly! Annual progress could now be monitored by standardised scores, compared with CAT results – and, of course, with the crucial learning evident in books and lessons.
This is by no means a fait accompli, but we are pleased with the steps we have taken thus far. It’s working for us – and that, I think, is the point. Marrying our system to the year 2 and year 6 expectations remains a challenge, but whatever the national tests say, we know our children have made progress – and we can show it.
We need to continue to work as an honest and open team – it is our challenge and conversation that is creating the life after levels. We also need to continue to work with the secondary schools that we feed into, to find a common assessment language.